EEG-Based Subject-Independent Emotion Recognition Using Gated Recurrent Unit and Minimum Class Confusion

Cited by: 18
Authors
Cui, Heng [1 ,2 ]
Liu, Aiping [2 ]
Zhang, Xu [2 ]
Chen, Xiang [2 ]
Liu, Jun [2 ]
Chen, Xun [1 ,3 ]
Affiliations
[1] Univ Sci & Technol China USTC, Affiliated Hosp 1, Dept Neurosurg, Div Life Sci & Med, Hefei 230001, Anhui, Peoples R China
[2] Univ Sci & Technol China, Sch Informat Sci & Technol, Hefei 230027, Anhui, Peoples R China
[3] Univ Sci & Technol China, Inst Adv Technol, USTC IAT Huami Joint Lab Brain Machine Intelligence, Hefei 230088, Anhui, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalogram (EEG); emotion recognition; subject-independent; gated recurrent unit (GRU); minimum class confusion (MCC); BODY GESTURE; SYSTEM;
DOI
10.1109/TAFFC.2022.3179717
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Automatic emotion recognition based on electroencephalogram (EEG) has attracted rapidly increasing interest. Due to large inter-subject variability, subject-independent emotion recognition faces great challenges. Recently, domain adaptation methods have been successfully applied in this field owing to their ability to align features from different subjects. However, since EEG signals corresponding to some emotions have similar oscillation patterns, they are often confused and aligned to the wrong categories, which limits the generalization ability of the model across subjects. Moreover, almost all existing methods support only offline application, which requires collecting a large number of samples from new subjects. To achieve online recognition, a simpler model is needed. In this paper, a novel Gated Recurrent Unit-Minimum Class Confusion (GRU-MCC) model is proposed. Specifically, a simple feature extractor based on a gated recurrent unit (GRU) is first applied to model the spatial dependence among multiple electrodes and obtain high-level discriminative features. Then, during training, a minimum class confusion (MCC) loss is introduced to reduce the confusion between the correct and ambiguous classes for the target subject and to increase the transfer gains. We conduct both offline and online experiments on two public datasets, SEED and MPED. The results indicate that our method achieves superior performance.
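The abstract describes two pieces that can be sketched concretely: a GRU that reads the electrode dimension as a sequence to model spatial dependence, and a minimum class confusion term computed on unlabeled target-subject predictions. Below is a minimal PyTorch sketch of these ideas, not the authors' implementation: the class name GRUFeatureExtractor, the layer sizes, and the temperature are illustrative assumptions, while mcc_loss follows the published MCC formulation of Jin et al. (ECCV 2020) that the paper builds on.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GRUFeatureExtractor(nn.Module):
    """Sketch of a GRU-based extractor: electrodes are fed as a sequence so the
    recurrent unit can model spatial dependence across channels (hypothetical
    sizes; the paper's exact architecture may differ)."""

    def __init__(self, feat_dim, hidden_dim=64, num_classes=3):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, n_electrodes, feat_dim), e.g. per-channel band features
        _, h = self.gru(x)              # h: (1, batch, hidden_dim)
        return self.classifier(h[-1])   # class logits: (batch, num_classes)


def mcc_loss(target_logits, temperature=2.0):
    """Minimum Class Confusion (Jin et al., ECCV 2020): penalize probability
    mass shared between different classes on unlabeled target-subject data."""
    B, C = target_logits.shape
    probs = F.softmax(target_logits / temperature, dim=1)        # (B, C)
    entropy = -(probs * torch.log(probs + 1e-8)).sum(dim=1)      # per-sample uncertainty
    w = 1.0 + torch.exp(-entropy.detach())                       # weight confident samples more
    w = (B * w / w.sum()).unsqueeze(1)                           # (B, 1), normalized weights
    confusion = (probs * w).t() @ probs                          # (C, C) class-confusion matrix
    confusion = confusion / confusion.sum(dim=1, keepdim=True)   # category normalization
    return (confusion.sum() - confusion.trace()) / C             # mean off-diagonal confusion


# Hypothetical training step: labeled source-subject batch plus unlabeled target batch.
# total_loss = F.cross_entropy(model(x_source), y_source) + lam * mcc_loss(model(x_target))
```

Note that the MCC term uses no target labels, which is what makes the subject-independent (and potentially online) setting described in the abstract workable; lam above is a trade-off weight assumed for illustration, not a value taken from the paper.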
Pages: 2740-2750
Number of pages: 11
Related papers
50 records in total
  • [1] Subject-Independent EEG-based Emotion Recognition using Adversarial Learning
    Hwang, Sunhee
    Ki, Minsong
    Hong, Kibeom
    Byun, Hyeran
    [J]. 2020 8TH INTERNATIONAL WINTER CONFERENCE ON BRAIN-COMPUTER INTERFACE (BCI), 2020, : 99 - 102
  • [2] Multimodal Deep Learning Model for Subject-Independent EEG-based Emotion Recognition
    Dharia, Shyamal Y.
    Valderrama, Camilo E.
    Camorlinga, Sergio G.
    [J]. 2023 IEEE CANADIAN CONFERENCE ON ELECTRICAL AND COMPUTER ENGINEERING, CCECE, 2023,
  • [3] Dynamic Stream Selection Network for Subject-Independent EEG-Based Emotion Recognition
    Li, Wei
    Dong, Jianzhang
    Liu, Shuxia
    Fan, Lingmin
    Wang, Siyi
    [J]. IEEE SENSORS JOURNAL, 2024, 24 (12) : 19336 - 19343
  • [4] DAGAM: a domain adversarial graph attention model for subject-independent EEG-based emotion recognition
    Xu, Tao
    Dang, Wang
    Wang, Jiabao
    Zhou, Yun
    [J]. JOURNAL OF NEURAL ENGINEERING, 2023, 20 (01)
  • [5] Subject-independent Emotion recognition based on Entropy of EEG Signals
    Yang, Haihui
    Rong, Panxiang
    Sun, Guobing
    [J]. PROCEEDINGS OF THE 33RD CHINESE CONTROL AND DECISION CONFERENCE (CCDC 2021), 2021, : 1513 - 1518
  • [6] An EEG-based subject-independent emotion recognition model using a differential-evolution-based feature selection algorithm
    Kannadasan, K.
    Veerasingam, Sridevi
    Begum, B. Shameedha
    Ramasubramanian, N.
    [J]. KNOWLEDGE AND INFORMATION SYSTEMS, 2023, 65 (01) : 341 - 377
  • [7] STSNet: a novel spatio-temporal-spectral network for subject-independent EEG-based emotion recognition
    Li, Rui
    Ren, Chao
    Zhang, Sipo
    Yang, Yikun
    Zhao, Qiqi
    Hou, Kechen
    Yuan, Wenjie
    Zhang, Xiaowei
    Hu, Bin
    [J]. HEALTH INFORMATION SCIENCE AND SYSTEMS, 2023, 11 (01)
  • [8] Convolutional gated recurrent unit-driven multidimensional dynamic graph neural network for subject-independent emotion recognition
    Guo, Wenhui
    Wang, Yanjiang
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238