Cross-subject driver fatigue detection from single-channel EEG with an interpretable deep learning model
CLC Number: TP391; R741.044; TH79

    Abstract:

    Electroencephalography (EEG) is considered one of the best physiological signals for detecting drivers' mental fatigue. However, EEG signals vary significantly across subjects and recording sessions, and designing a calibration-free EEG fatigue detection system remains challenging. In recent years, many deep learning-based methods have been developed to address this issue and have achieved significant progress. However, the "black-box" nature of deep learning models makes their decisions difficult to trust. Therefore, this article proposes an interpretable deep learning model that recognizes cross-subject fatigue states from single-channel EEG signals. The model has a compact network structure. First, a shallow CNN is designed to extract EEG features. Then, an adaptive feature recalibration mechanism is introduced to enhance the feature extraction ability. Finally, the time series of extracted features is classified with an LSTM. Interpretable information about the classification decision is obtained through a visualization technique that exploits the hidden states output by the LSTM layer. Extensive cross-subject experiments on an open EEG dataset with a sustained-attention driving task show that the proposed model achieves the highest average accuracy of 76.26%. In addition, compared with advanced compact deep learning models, the proposed model requires fewer parameters and less computation. Visualization results indicate that the proposed model discovers neurophysiologically plausible interpretations.
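    The adaptive feature recalibration step mentioned in the abstract can be sketched as a squeeze-and-excitation-style gate applied to the CNN feature maps. The function and weight names below are hypothetical illustrations, not the paper's actual implementation:

    ```python
    import numpy as np

    def recalibrate(features, w1, b1, w2, b2):
        """Hypothetical adaptive feature recalibration sketch.

        features: (channels, time) feature maps from a shallow CNN.
        w1, b1:   bottleneck fully connected layer (reduces channel dim).
        w2, b2:   expansion fully connected layer (restores channel dim).
        """
        # Squeeze: global average pooling over the time axis -> (channels,)
        squeeze = features.mean(axis=1)
        # Excitation: bottleneck FC + ReLU, then FC + sigmoid gates in (0, 1)
        hidden = np.maximum(0.0, w1 @ squeeze + b1)
        scale = 1.0 / (1.0 + np.exp(-(w2 @ hidden + b2)))
        # Recalibrate: reweight each feature channel by its learned gate
        return features * scale[:, None]

    # Toy usage: 2 feature channels over 3 time steps, bottleneck size 1
    feats = np.array([[1.0, -2.0, 3.0],
                      [0.5,  0.5, 0.5]])
    w1 = np.array([[0.3, -0.2]]); b1 = np.array([0.1])
    w2 = np.array([[0.4], [-0.4]]); b2 = np.array([0.0, 0.0])
    out = recalibrate(feats, w1, b1, w2, b2)
    ```

    Because the sigmoid gates lie strictly in (0, 1), the mechanism suppresses less informative channels rather than amplifying any; the recalibrated maps would then feed the LSTM, whose hidden states support the visualization-based interpretation.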

History
  • Online: August 17, 2023