Subject matching for cross-subject EEG-based recognition of driver states related to situation awareness

Cited: 20
Authors
Li, Ruilin [1 ,2 ]
Wang, Lipo [1 ]
Sourina, Olga [2 ]
Affiliations
[1] Nanyang Technol Univ, Sch Elect & Elect Engn, Singapore 639798, Singapore
[2] Fraunhofer Singapore, Singapore 639798, Singapore
Funding
National Research Foundation of Singapore;
Keywords
Situation awareness; Electroencephalography (EEG); Transfer learning; Machine learning; Classification; Batch normalization;
DOI
10.1016/j.ymeth.2021.04.009
CLC number
Q5 [Biochemistry];
Subject classification codes
071010; 081704;
Abstract
Situation awareness (SA) has received much attention in recent years because of its importance for operators of dynamic systems. Electroencephalography (EEG) can be used to measure mental states of operators related to SA. However, cross-subject EEG-based SA recognition remains a critical challenge, as data distributions vary significantly across subjects. Subject variability is considered a domain shift problem. Several attempts have been made to find domain-invariant features among subjects, but subject-specific information is neglected in the process. In this work, we propose a simple but efficient subject matching framework that finds a connection between a target (test) subject and source (training) subjects. Specifically, the framework has two stages: (1) training, in which the model is trained with multi-source domain alignment layers to collect source-domain statistics; (2) testing, in which a distance is computed in the latent representation space to perform subject matching. We use a reciprocal exponential function as a similarity measure to dynamically select similar source subjects. Experimental results show that our framework achieves a state-of-the-art accuracy of 74.32% on the Taiwan driving dataset.
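The testing stage described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each subject is summarized by a mean latent-feature vector (the paper collects statistics via multi-source domain alignment layers), uses Euclidean distance, and takes the reciprocal exponential of distance, i.e. 1/exp(d), as the similarity; the function name and interface are hypothetical.

```python
import numpy as np

def match_subjects(target_stats, source_stats_list):
    """Weight source (training) subjects by similarity to the target (test) subject.

    target_stats: mean latent-feature vector of the target subject.
    source_stats_list: list of mean latent-feature vectors, one per source subject.
    Returns similarity weights normalized to sum to 1; larger weight means the
    source subject is closer to the target in the latent representation space.
    """
    # Euclidean distance from the target to each source subject's statistics.
    distances = np.array([np.linalg.norm(target_stats - s)
                          for s in source_stats_list])
    # Reciprocal exponential similarity: nearby subjects get weights near 1,
    # distant subjects decay rapidly toward 0.
    similarities = 1.0 / np.exp(distances)
    return similarities / similarities.sum()
```

With such weights, predictions (or domain-specific normalization statistics) from similar source subjects can be emphasized dynamically for each incoming target subject, rather than treating all training subjects equally.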
Pages: 136-143 (8 pages)
Related papers (50 total)
  • [31] Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
    Shen, Xinke
    Liu, Xianggen
    Hu, Xin
    Zhang, Dan
    Song, Sen
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 2496 - 2511
  • [32] Building Cross-Subject EEG-Based Affective Models Using Heterogeneous Transfer Learning
    Zheng, W.-L.
    Shi, Z.-F.
    Lv, B.-L. (bllu@sjtu.edu.cn)
    Science Press, 1600, (43): 177-189
  • [33] Multisource Transfer Learning for Cross-Subject EEG Emotion Recognition
    Li, Jinpeng
    Qiu, Shuang
    Shen, Yuan-Yuan
    Liu, Cheng-Lin
    He, Huiguang
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (07) : 3281 - 3293
  • [34] Cross-subject emotion EEG signal recognition based on source microstate analysis
    Zhang, Lei
    Xiao, Di
    Guo, Xiaojing
    Li, Fan
    Liang, Wen
    Zhou, Bangyan
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [35] Subject Adaptive EEG-Based Visual Recognition
    Lee, Pilhyeon
    Hwang, Sunhee
    Jeon, Seogkyu
    Byun, Hyeran
    PATTERN RECOGNITION, ACPR 2021, PT II, 2022, 13189 : 322 - 334
  • [36] Cross-Subject Cognitive Workload Recognition Based on EEG and Deep Domain Adaptation
    Zhou, Yueying
    Wang, Pengpai
    Gong, Peiliang
    Wei, Fulin
    Wen, Xuyun
    Wu, Xia
    Zhang, Daoqiang
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2023, 72
  • [37] Cross-Subject Emotion Recognition Based on Domain Similarity of EEG Signal Transfer
    Ma, Yuliang
    Zhao, Weicheng
    Meng, Ming
    Zhang, Qizhong
    She, Qingshan
    Zhang, Jianhai
    IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2023, 31 : 936 - 943
  • [38] EEG-based cross-subject passive music pitch perception using deep learning models
    Meng, Qiang
    Tian, Lan
    Liu, Guoyang
    Zhang, Xue
    Cognitive Neurodynamics, 2025, 19 (1)
  • [39] InstanceEasyTL: An Improved Transfer-Learning Method for EEG-Based Cross-Subject Fatigue Detection
    Zeng, Hong
    Zhang, Jiaming
    Zakaria, Wael
    Babiloni, Fabio
    Gianluca, Borghini
    Li, Xiufeng
    Kong, Wanzeng
    SENSORS, 2020, 20 (24) : 1 - 17
  • [40] Hybrid transfer learning strategy for cross-subject EEG emotion recognition
    Lu, Wei
    Liu, Haiyan
    Ma, Hua
    Tan, Tien-Ping
    Xia, Lingnan
    FRONTIERS IN HUMAN NEUROSCIENCE, 2023, 17