Multisource Associate Domain Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition

Cited by: 14
Authors:
She, Qingshan [1 ]
Zhang, Chenqi [2 ]
Fang, Feng [3 ]
Ma, Yuliang [1 ]
Zhang, Yingchun [3 ]
Affiliations:
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Zhejiang, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Zhejiang, Peoples R China
[3] Univ Houston, Dept Biomed Engn, Houston, TX 77204 USA
Funding:
National Natural Science Foundation of China;
Keywords:
Feature extraction; Emotion recognition; Electroencephalography; Brain modeling; Adaptation models; Data models; Data mining; Domain adaptation (DA); electroencephalogram (EEG); emotion recognition; transfer learning; DIFFERENTIAL ENTROPY FEATURE;
DOI:
10.1109/TIM.2023.3277985
Chinese Library Classification (CLC):
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline codes:
0808; 0809;
Abstract:
Emotion recognition is important in brain-computer interface (BCI) applications, and building a robust emotion recognition model across subjects and sessions is critical for emotion-based BCI systems. Electroencephalogram (EEG) is a widely used tool for recognizing different emotional states. However, EEG has disadvantages such as small amplitude, low signal-to-noise ratio, and nonstationary properties, resulting in large differences across subjects. To address these problems, this article proposes a new emotion recognition method based on a multisource associate domain adaptation (DA) network, considering both domain-invariant and domain-specific features. First, separate branches were constructed for multiple source domains, assuming that different EEG data share the same low-level features. Second, the domain-specific features were extracted using one-to-one associate DA. Then, weighted scores for the specific sources were obtained according to their distribution distances, and the multiple source classifiers were combined with the corresponding weights. Finally, EEG emotion recognition experiments were conducted on the SEED, DEAP, and SEED-IV datasets. In the cross-subject experiments, the average accuracy was 86.16% on the SEED dataset, 65.59% on DEAP, and 59.29% on SEED-IV. In the cross-session experiments, the accuracies on SEED and SEED-IV were 91.10% and 66.68%, respectively. The proposed method achieved better classification results than state-of-the-art DA methods.
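Below is a minimal, hypothetical PyTorch sketch of the kind of multisource weighting the abstract describes: a shared low-level extractor, one domain-specific branch and classifier per source subject, and a fusion of per-source predictions weighted by a distribution distance. The layer sizes, the 310-dimensional differential-entropy input, the three emotion classes, and the linear-kernel MMD used as the distance are illustrative assumptions, not the authors' exact architecture or training procedure.

```python
# Hypothetical sketch of a multisource DA classifier for EEG features.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharedExtractor(nn.Module):
    """Low-level feature extractor shared by all source branches."""
    def __init__(self, in_dim=310, hid_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())

    def forward(self, x):
        return self.net(x)


class SourceBranch(nn.Module):
    """Domain-specific feature extractor plus classifier for one source domain."""
    def __init__(self, hid_dim=128, feat_dim=64, n_classes=3):
        super().__init__()
        self.specific = nn.Sequential(nn.Linear(hid_dim, feat_dim), nn.ReLU())
        self.classifier = nn.Linear(feat_dim, n_classes)

    def forward(self, h):
        f = self.specific(h)
        return f, self.classifier(f)


def linear_mmd(a, b):
    """Linear-kernel MMD between two feature batches (distribution-distance proxy)."""
    return (a.mean(dim=0) - b.mean(dim=0)).pow(2).sum()


class MultiSourceDANet(nn.Module):
    """One branch per source domain; predictions fused by distance-based weights."""
    def __init__(self, n_sources, in_dim=310, n_classes=3):
        super().__init__()
        self.shared = SharedExtractor(in_dim)
        self.branches = nn.ModuleList(
            [SourceBranch(n_classes=n_classes) for _ in range(n_sources)])

    def forward(self, source_batches, x_target):
        """source_batches: list of (batch, in_dim) tensors, one per source domain."""
        h_t = self.shared(x_target)
        probs, dists = [], []
        for x_s, branch in zip(source_batches, self.branches):
            f_s, _ = branch(self.shared(x_s))          # source-specific features
            f_t, logits_t = branch(h_t)                # target through the same branch
            probs.append(F.softmax(logits_t, dim=1))   # per-source target prediction
            dists.append(linear_mmd(f_s, f_t))         # source-target distance
        weights = F.softmax(-torch.stack(dists), dim=0)  # closer source -> larger weight
        return sum(w * p for w, p in zip(weights, probs))  # fused class probabilities


# Toy usage: 14 source subjects (e.g., leave-one-subject-out on SEED),
# 310-D differential-entropy features, 3 emotion classes.
model = MultiSourceDANet(n_sources=14)
sources = [torch.randn(32, 310) for _ in range(14)]
target = torch.randn(32, 310)
print(model(sources, target).shape)  # torch.Size([32, 3])
```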
Pages: 12