A time-frequency feature fusion and collaborative classification method for motion decoding with EEG-fNIRS signals
CLC Number: TP391 TH776

Abstract:

Functional neuroimaging technology can reflect physiological changes in the brain and be used to decode movement states; however, the information provided by any single neuroimaging modality is limited. In this article, a time-frequency feature fusion and collaborative classification method is proposed to achieve high-precision motion state decoding from EEG and fNIRS signals, exploiting the complementarity of electrical activity and hemoglobin concentration changes. First, wavelet packet energy entropy features are extracted from the EEG signal, a Bi-LSTM deep neural network is used to extract time-domain features from the fNIRS signal, and the resulting features are concatenated to obtain fusion features that carry both time-domain and frequency-domain information, so that the complementarity of EEG and fNIRS is exploited at multiple levels. Then, a one-dimensional convolutional neural network (1D CNN) extracts deeper-level information from the fusion features. Finally, a fully connected neural network performs the classification. The proposed method was tested on a public dataset. The EEG-fNIRS collaborative classification method achieves an accuracy of 95.31%, which is 7.81% to 9.60% higher than those of the single-modality classification methods. Experimental results show that this method fully integrates the time-frequency domain information of two physiologically complementary signals and improves the classification accuracy of left- and right-hand grip tasks.
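The abstract specifies the overall pipeline but not its hyperparameters. Below is a minimal Python sketch (NumPy, PyWavelets, PyTorch) of the described architecture; the wavelet choice (db4), decomposition level, layer widths, kernel sizes, and the names wavelet_packet_energy_entropy and EEGfNIRSFusionNet are illustrative assumptions, not the authors' exact configuration.

```python
import numpy as np
import pywt                      # PyWavelets, for wavelet packet decomposition
import torch
import torch.nn as nn


def wavelet_packet_energy_entropy(signal, wavelet="db4", level=4):
    """Wavelet packet energy entropy of one EEG channel (assumed settings).

    The signal is decomposed into 2**level terminal nodes; the relative
    energy p_i of each node forms a distribution whose Shannon entropy
    H = -sum(p_i * log p_i) is returned as the frequency-domain feature.
    """
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=level)
    nodes = wp.get_level(level, order="natural")
    energy = np.array([np.sum(n.data ** 2) for n in nodes])
    p = energy / energy.sum()
    p = p[p > 0]                 # drop empty bins to avoid log(0)
    return float(-np.sum(p * np.log(p)))


class EEGfNIRSFusionNet(nn.Module):
    """Hypothetical fusion network: Bi-LSTM (fNIRS) -> concat -> 1D CNN -> FC."""

    def __init__(self, n_eeg_feats, fnirs_channels, hidden=64, n_classes=2):
        super().__init__()
        # Bi-LSTM extracts time-domain features from the fNIRS sequence.
        self.bilstm = nn.LSTM(fnirs_channels, hidden,
                              batch_first=True, bidirectional=True)
        # 1D CNN mines deeper-level structure from the fused feature vector.
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool1d(8),
        )
        # Fully connected classifier for the left/right-hand grip classes.
        self.fc = nn.Sequential(nn.Flatten(), nn.Linear(32 * 8, n_classes))

    def forward(self, eeg_feats, fnirs_seq):
        # eeg_feats:  (batch, n_eeg_feats)  wavelet packet energy entropies
        # fnirs_seq:  (batch, time, fnirs_channels) HbO/HbR time series
        _, (h, _) = self.bilstm(fnirs_seq)
        fnirs_feats = torch.cat([h[0], h[1]], dim=1)        # fwd + bwd states
        fused = torch.cat([eeg_feats, fnirs_feats], dim=1)  # feature fusion
        return self.fc(self.cnn(fused.unsqueeze(1)))        # add channel dim


# Toy usage with made-up sizes: 16 EEG entropy features, 40 fNIRS channels.
model = EEGfNIRSFusionNet(n_eeg_feats=16, fnirs_channels=40)
logits = model(torch.randn(8, 16), torch.randn(8, 100, 40))  # -> (8, 2)
```

The concatenation step is where the multi-level complementarity described in the abstract is realized: frequency-domain EEG entropies and time-domain fNIRS dynamics enter the 1D CNN as a single fused vector.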

History
  • Online: February 06, 2023