Background: Decoding hand movement parameters from electroencephalogram (EEG) signals can provide intuitive control for brain-computer interfaces (BCIs). However, most existing studies of EEG-based hand movement decoding focus on single-hand movements. Because both-hand movements are common in human augmentation systems, and to address decoding of one hand's movement while the opposite hand is also moving, we investigated the neural signatures of the primary hand's movement direction and its decoding from EEG signals under opposite hand movement. Methods: The decoding model used an echo state network (ESN) to extract nonlinear dynamic parameters of movement-related cortical potentials (MRCPs) as decoding features, with linear discriminant analysis (LDA) as the classifier.
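The ESN-plus-LDA pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the reservoir size, spectral radius, leak rate, and the use of the final reservoir state as the feature vector are all assumptions, and the synthetic trials are placeholders for real MRCP epochs.

```python
# Hedged sketch of an ESN feature extractor plus a two-class LDA decoder.
# All hyperparameters and the synthetic data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def esn_features(trial, n_res=50, spectral_radius=0.9, leak=0.3, seed=1):
    """Map a (time, channels) EEG trial to a reservoir-state feature vector."""
    r = np.random.default_rng(seed)
    n_in = trial.shape[1]
    W_in = r.uniform(-0.5, 0.5, (n_res, n_in))   # fixed random input weights
    W = r.uniform(-0.5, 0.5, (n_res, n_res))     # fixed random recurrent weights
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # echo state property
    x = np.zeros(n_res)
    for u in trial:  # leaky-integrator reservoir update over time samples
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
    return x  # final reservoir state used as the decoding feature

def lda_fit(X, y):
    """Two-class LDA: weight vector from pooled covariance and class means."""
    m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
    S = np.cov(X[y == 0], rowvar=False) + np.cov(X[y == 1], rowvar=False)
    w = np.linalg.solve(S + 1e-6 * np.eye(S.shape[0]), m1 - m0)  # regularized
    b = -w @ (m0 + m1) / 2
    return w, b

# Toy demo on synthetic "trials" standing in for real MRCP epochs:
trials, labels = [], []
for i in range(40):
    cls = i % 2  # two movement-direction classes
    trials.append(rng.standard_normal((100, 8)) * 0.1 + (0.5 if cls else -0.5))
    labels.append(cls)
X = np.array([esn_features(t) for t in trials])
y = np.array(labels)
w, b = lda_fit(X, y)
pred = (X @ w + b > 0).astype(int)
print("training accuracy:", (pred == y).mean())
```

In practice the reservoir hyperparameters would be tuned and the classifier evaluated with cross-validation on held-out trials rather than training accuracy.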
Results: Significant differences in MRCPs were found between movement conditions with and without an opposite hand movement. Furthermore, using the ESN-based models, decoding accuracies reached 86.03 ± 7.32% and 88.45 ± 6.16% under the conditions without and with the opposite hand movement, respectively.
Conclusions: These findings show that the proposed method performs well in decoding the primary hand's movement direction both with and without the opposite hand movement. This study may open a new avenue for decoding hand movement parameters from EEG signals and lays a foundation for the future development of BCI-based human augmentation systems.