Multisource Associate Domain Adaptation for Cross-Subject and Cross-Session EEG Emotion Recognition

Cited by: 35
Authors
She, Qingshan [1 ]
Zhang, Chenqi [2 ]
Fang, Feng [3 ]
Ma, Yuliang [1 ]
Zhang, Yingchun [3 ]
Affiliations
[1] Hangzhou Dianzi Univ, Sch Automat, Hangzhou 310018, Zhejiang, Peoples R China
[2] Hangzhou Dianzi Univ, HDU ITMO Joint Inst, Hangzhou 310018, Zhejiang, Peoples R China
[3] Univ Houston, Dept Biomed Engn, Houston, TX 77204 USA
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Emotion recognition; Electroencephalography; Brain modeling; Adaptation models; Data models; Data mining; Domain adaptation (DA); electroencephalogram (EEG); emotion recognition; transfer learning; DIFFERENTIAL ENTROPY FEATURE;
DOI
10.1109/TIM.2023.3277985
CLC Classification
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808 ; 0809 ;
Abstract
Emotion recognition is important in brain-computer interface (BCI) applications, and building a recognition model that is robust across subjects and sessions is critical for emotion-based BCI systems. The electroencephalogram (EEG) is a widely used tool for recognizing different emotional states. However, EEG suffers from small amplitude, a low signal-to-noise ratio, and nonstationarity, which lead to large differences across subjects. To address these problems, this article proposes a new emotion recognition method based on a multisource associate domain adaptation (DA) network that considers both domain-invariant and domain-specific features. First, separate branches were constructed for multiple source domains, under the assumption that EEG data from different domains share the same low-level features. Second, domain-specific features were extracted using one-to-one associate DA. Then, weighted scores for the specific sources were obtained according to the distribution distance, and the multiple source classifiers were combined with the corresponding weights. Finally, EEG emotion recognition experiments were conducted on the SEED, DEAP, and SEED-IV datasets. In the cross-subject experiments, the average accuracy was 86.16% on SEED, 65.59% on DEAP, and 59.29% on SEED-IV. In the cross-session experiments, the accuracies on SEED and SEED-IV were 91.10% and 66.68%, respectively. The proposed method achieved better classification results than state-of-the-art DA methods.
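The fusion step described in the abstract — scoring each source domain by its distribution distance to the target and weighting the source classifiers accordingly — can be sketched as a toy example. This is a minimal illustration, not the authors' implementation: the mean-difference distance `linear_mmd`, the softmax weighting, and the placeholder classifier outputs are all assumptions standing in for the learned associate DA network and trained source-specific classifiers.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_mmd(xs, xt):
    """Squared distance between domain feature means — a simple proxy
    for the distribution distance used to score each source domain."""
    return float(np.sum((xs.mean(axis=0) - xt.mean(axis=0)) ** 2))

# Toy features: three source subjects and one target subject, each with
# 100 samples of 10-dim features (e.g., differential entropy features).
sources = [rng.normal(loc=mu, size=(100, 10)) for mu in (0.0, 0.5, 2.0)]
target = rng.normal(loc=0.2, size=(100, 10))

# Placeholder per-source classifier probabilities over 3 emotion classes,
# standing in for the trained domain-specific classifiers.
probs = [rng.dirichlet(np.ones(3), size=100) for _ in sources]

# Weight each source inversely to its distance from the target
# (softmax over negative distances), then fuse the predictions.
d = np.array([linear_mmd(xs, target) for xs in sources])
w = np.exp(-d) / np.exp(-d).sum()
fused = sum(wi * pi for wi, pi in zip(w, probs))
pred = fused.argmax(axis=1)

print(w.round(3))  # sources closer to the target receive larger weights
```

By construction, the most dissimilar source (mean 2.0 vs. target mean 0.2) receives a near-zero weight, so its classifier contributes little to the fused decision — the same intuition as the paper's distance-based source weighting.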
Pages: 12
Related Papers (50 total)
  • [31] Cross-Subject Emotion Recognition Using Deep Adaptation Networks
    Li, He
    Jin, Yi-Ming
    Zheng, Wei-Long
    Lu, Bao-Liang
    NEURAL INFORMATION PROCESSING (ICONIP 2018), PT V, 2018, 11305 : 403 - 413
  • [32] JOINT TEMPORAL CONVOLUTIONAL NETWORKS AND ADVERSARIAL DISCRIMINATIVE DOMAIN ADAPTATION FOR EEG-BASED CROSS-SUBJECT EMOTION RECOGNITION
    He, Zhipeng
    Zhong, Yongshi
    Pan, Jiahui
    2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3214 - 3218
  • [33] MGFKD: A semi-supervised multi-source domain adaptation algorithm for cross-subject EEG emotion recognition
    Zhang, Rui
    Guo, Huifeng
    Xu, Zongxin
    Hu, Yuxia
    Chen, Mingming
    Zhang, Lipeng
    BRAIN RESEARCH BULLETIN, 2024, 208
  • [34] Gusa: Graph-Based Unsupervised Subdomain Adaptation for Cross-Subject EEG Emotion Recognition
    Li, Xiaojun
    Chen, C. L. Philip
    Chen, Bianna
    Zhang, Tong
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2024, 15 (03) : 1451 - 1462
  • [35] Cross-Subject Emotion Recognition Using Fused Entropy Features of EEG
    Zuo, Xin
    Zhang, Chi
    Hamalainen, Timo
    Gao, Hanbing
    Fu, Yu
    Cong, Fengyu
    ENTROPY, 2022, 24 (09)
  • [36] Joint EEG Feature Transfer and Semisupervised Cross-Subject Emotion Recognition
    Peng, Yong
    Liu, Honggang
    Kong, Wanzeng
    Nie, Feiping
    Lu, Bao-Liang
    Cichocki, Andrzej
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2023, 19 (07) : 8104 - 8115
  • [37] Hybrid transfer learning strategy for cross-subject EEG emotion recognition
    Lu, Wei
    Liu, Haiyan
    Ma, Hua
    Tan, Tien-Ping
    Xia, Lingnan
    FRONTIERS IN HUMAN NEUROSCIENCE, 2023, 17
  • [38] Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition
    Shen, Xinke
    Liu, Xianggen
    Hu, Xin
    Zhang, Dan
    Song, Sen
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2023, 14 (03) : 2496 - 2511
  • [39] Cross-subject EEG emotion recognition using multi-source domain manifold feature selection
    She, Qingshan
    Shi, Xinsheng
    Fang, Feng
    Ma, Yuliang
    Zhang, Yingchun
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 159
  • [40] Cross-subject and Cross-gender Emotion Classification from EEG
    Zhu, Jia-Yi
    Zheng, Wei-Long
    Lu, Bao-Liang
    WORLD CONGRESS ON MEDICAL PHYSICS AND BIOMEDICAL ENGINEERING, 2015, VOLS 1 AND 2, 2015, 51 : 1188 - 1191