Neurophysiological signals are manifestations of underlying brain activity, and they contain an abundance of neural information. Decoding and understanding these signals is useful for developing robotic exoskeletons, thereby benefiting device-aided motor rehabilitation. To date, numerous efforts have been made to explore the relations between neurophysiological signals and locomotor capacity. However, most of these studies focused on a single signal modality and ignored the complementary information available across modalities. In this study, signals from two kinds of biosensors, electroencephalogram (EEG) and electromyogram (EMG), were fused, and a novel deep learning model, multi-scale learning (MSL), was proposed to classify four walking patterns. The EEG and EMG data were collected during a walking experiment in which different walking conditions with and without exoskeleton-aided assistance were implemented (i.e., free walking and exoskeleton-aided walking at zero, low, and high assistive forces). The performance of the MSL model was compared to that of existing models, and the results show that multimodal MSL achieved the highest classification accuracy (89.33%). Moreover, the comparisons in our study show that classification performance improved when the full 62-channel EEG setting was used rather than a subset of 20 channels located over the sensorimotor region. This work contributes to improving neurophysiological signal decoding and promotes the development of rehabilitation technologies and exoskeleton-aided applications.
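The abstract describes fusing EEG and EMG features at multiple scales before classification. The following is a minimal NumPy sketch of that feature-level fusion idea only, not the authors' MSL architecture: it extracts a simple multi-window RMS feature per channel (the window lengths and the 8-channel EMG count are illustrative assumptions; only the 62-channel EEG count comes from the study) and fuses the two modalities by concatenation.

```python
import numpy as np

def multiscale_features(signal, scales=(32, 64, 128)):
    """Toy multi-scale feature extractor: mean RMS over windows of
    several lengths. The scales are illustrative assumptions, not
    taken from the paper."""
    feats = []
    for w in scales:
        n = len(signal) // w
        windows = signal[: n * w].reshape(n, w)
        feats.append(np.sqrt(np.mean(windows**2, axis=1)).mean())
    return np.array(feats)

rng = np.random.default_rng(0)
eeg = rng.standard_normal((62, 1024))  # 62 EEG channels, as in the study
emg = rng.standard_normal((8, 1024))   # hypothetical EMG channel count

# One feature vector per modality, then multimodal fusion by concatenation.
eeg_feats = np.concatenate([multiscale_features(ch) for ch in eeg])
emg_feats = np.concatenate([multiscale_features(ch) for ch in emg])
fused = np.concatenate([eeg_feats, emg_feats])
print(fused.shape)  # 62*3 + 8*3 = 210 fused features
```

The fused vector would then feed a downstream classifier over the four walking patterns; the actual MSL model learns such multi-scale representations end to end rather than hand-crafting them.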