Learning a robust unified domain adaptation framework for cross-subject EEG-based emotion recognition

Cited: 8
Authors
Jimenez-Guarneros, Magdiel [1 ]
Fuentes-Pineda, Gibran [1 ]
Institutions
[1] Univ Nacl Autonoma Mexico, Dept Comp Sci, Inst Invest Matemat Aplicadas & Sistemas, Circuito Escolar S-N, Ciudad Univ, Mexico City 04510, Mexico
Keywords
Unsupervised domain adaptation; Deep learning; Emotion recognition; Electroencephalogram; Neural networks
DOI
10.1016/j.bspc.2023.105138
Chinese Library Classification
R318 [Biomedical Engineering]
Discipline code
0831
Abstract
Over the last few years, unsupervised domain adaptation (UDA) based on deep learning has emerged as a solution for building cross-subject emotion recognition models from electroencephalogram (EEG) signals by aligning subject distributions within a latent feature space. However, most reported works share an intrinsic limitation: the subject distribution alignment is coarse-grained, yet not all of the feature space is shared between subjects. In this paper, we propose a robust unified domain adaptation framework, named Multi-source Feature Alignment and Label Rectification (MFA-LR), which performs fine-grained domain alignment at the subject and class levels, while inter-class separation and robustness against input perturbations are encouraged at a coarse grain. As a complementary step, a pseudo-label correction procedure rectifies mislabeled target samples. Our proposal was assessed on two public datasets, SEED and SEED-IV, on each of the three available sessions, using leave-one-subject-out cross-validation. Experimental results show accuracies of up to 89.11% ± 7.72% and 74.99% ± 12.10% on the best session of SEED and SEED-IV, respectively, as well as average accuracies of 85.27% and 69.58% across all three sessions, outperforming state-of-the-art results.
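The evaluation protocol named in the abstract, leave-one-subject-out (LOSO) cross-validation, can be sketched as follows. This is a minimal illustration with synthetic data and a simple nearest-centroid classifier standing in for the paper's deep model; the subject/trial counts and the 310-dimensional feature size (62 channels × 5 bands, as commonly extracted from SEED) are assumptions for the sketch, not the paper's exact setup.

```python
import numpy as np

# Synthetic stand-in data: 5 subjects, 20 trials each, 310-dim features,
# 3 emotion classes (SEED uses negative/neutral/positive).
rng = np.random.default_rng(0)
n_subjects, n_trials, n_feats, n_classes = 5, 20, 310, 3
X = rng.normal(size=(n_subjects, n_trials, n_feats))
y = rng.integers(0, n_classes, size=(n_subjects, n_trials))

def nearest_centroid_fit_predict(X_train, y_train, X_test):
    """Fit per-class centroids on the source data, label the target data."""
    centroids = np.stack([X_train[y_train == c].mean(axis=0)
                          for c in range(n_classes)])
    dists = np.linalg.norm(X_test[:, None, :] - centroids[None], axis=2)
    return dists.argmin(axis=1)

# Leave-one-subject-out: each subject serves once as the unseen target
# domain, while all remaining subjects form the multi-source training set.
accuracies = []
for target in range(n_subjects):
    sources = [s for s in range(n_subjects) if s != target]
    X_train = X[sources].reshape(-1, n_feats)
    y_train = y[sources].reshape(-1)
    y_pred = nearest_centroid_fit_predict(X_train, y_train, X[target])
    accuracies.append((y_pred == y[target]).mean())

print(f"Mean LOSO accuracy over {n_subjects} folds: {np.mean(accuracies):.3f}")
```

In the paper's setting, the classifier inside the loop would be the MFA-LR network trained with the labeled source subjects plus the unlabeled target subject's data (the domain adaptation step); the loop structure itself is the same.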
Pages: 13
Related Papers
50 records
  • [31] Hybrid transfer learning strategy for cross-subject EEG emotion recognition
    Lu, Wei
    Liu, Haiyan
    Ma, Hua
    Tan, Tien-Ping
    Xia, Lingnan
    FRONTIERS IN HUMAN NEUROSCIENCE, 2023, 17
  • [32] Multi-source Selective Graph Domain Adaptation Network for cross-subject EEG emotion recognition
    Wang, Jing
    Ning, Xiaojun
    Xu, Wei
    Li, Yunze
    Jia, Ziyu
    Lin, Youfang
NEURAL NETWORKS, 2024, 180
  • [33] Gusa: Graph-Based Unsupervised Subdomain Adaptation for Cross-Subject EEG Emotion Recognition
    Li, Xiaojun
    Chen, C. L. Philip
    Chen, Bianna
    Zhang, Tong
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2024, 15 (03) : 1451 - 1462
  • [34] FMLAN: A novel framework for cross-subject and cross-session EEG emotion recognition
    Yu, Peng
    He, Xiaopeng
    Li, Haoyu
    Dou, Haowen
    Tan, Yeyu
    Wu, Hao
    Chen, Badong
BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 100
  • [35] Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
    Shen, Xinke
    Liu, Xianggen
    Hu, Xin
    Zhang, Dan
    Song, Sen
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 2496 - 2511
  • [36] Exploring EEG Features in Cross-Subject Emotion Recognition
    Li, Xiang
    Song, Dawei
    Zhang, Peng
    Zhang, Yazhou
    Hou, Yuexian
    Hu, Bin
    FRONTIERS IN NEUROSCIENCE, 2018, 12
  • [37] Easy Domain Adaptation for cross-subject multi-view emotion recognition
    Chen, Chuangquan
    Vong, Chi-Man
    Wang, Shitong
    Wang, Hongtao
    Pang, Miaoqi
    KNOWLEDGE-BASED SYSTEMS, 2022, 239
  • [38] Self-supervised contrastive learning for EEG-based cross-subject motor imagery recognition
    Li, Wenjie
    Li, Haoyu
    Sun, Xinlin
    Kang, Huicong
    An, Shan
    Wang, Guoxin
    Gao, Zhongke
    JOURNAL OF NEURAL ENGINEERING, 2024, 21 (02)
  • [39] EEG-based Emotion Recognition Using Domain Adaptation Network
    Jin, Yi-Ming
    Luo, Yu-Dong
    Zheng, Wei-Long
    Lu, Bao-Liang
    PROCEEDINGS OF THE 2017 INTERNATIONAL CONFERENCE ON ORANGE TECHNOLOGIES (ICOT), 2017, : 222 - 225
  • [40] Cross-Subject EEG Signal Recognition Using Deep Domain Adaptation Network
    Hang, Wenlong
    Feng, Wei
    Du, Ruoyu
    Liang, Shuang
    Chen, Yan
    Wang, Qiong
    Liu, Xuejun
    IEEE ACCESS, 2019, 7 : 128273 - 128282