Cross-Subject Emotion Recognition Using Deep Adaptation Networks

Cited by: 94
Authors
Li, He [1 ]
Jin, Yi-Ming [1 ]
Zheng, Wei-Long [1 ]
Lu, Bao-Liang [1 ,2 ,3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Ctr Brain Comp & Machine Intelligence, Dept Comp Sci & Engn, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
[2] Shanghai Jiao Tong Univ, Key Lab Shanghai Educ Commiss Intelligent Interac, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
[3] Shanghai Jiao Tong Univ, Brain Sci & Technol Res Ctr, 800 Dong Chuan Rd, Shanghai 200240, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Affective brain-computer interface; Emotion recognition; EEG; Deep neural network; Domain adaptation;
DOI
10.1007/978-3-030-04221-9_36
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Affective models based on EEG signals have been proposed in recent years. However, most of these models require subject-specific training and generalize poorly when applied to new subjects, mainly because of individual differences across subjects. On the other hand, collecting subject-specific training data for every new user is time-consuming and costly. How to eliminate individual differences in EEG signals is therefore one of the key challenges in building practical affective models. In this paper, we apply a Deep Adaptation Network (DAN) to address this problem. The performance is evaluated on two publicly available EEG emotion recognition datasets, SEED and SEED-IV, in comparison with two baseline methods without domain adaptation and several other domain adaptation methods. The experimental results indicate that DAN significantly outperforms the existing methods.
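
The DAN approach evaluated above trains a shared feature extractor with two objectives: an ordinary classification loss on labelled source-subject data, plus a multi-kernel maximum mean discrepancy (MK-MMD) penalty that pulls the source and target feature distributions together. Below is a minimal PyTorch sketch of this objective, assuming 310-dimensional differential-entropy features (62 channels x 5 frequency bands, as commonly used with SEED) and 3 emotion classes; the layer sizes, kernel bandwidths, and trade-off weight lam are illustrative, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class FeatureExtractor(nn.Module):
    """Maps 310-D differential-entropy EEG features (62 channels x 5 bands,
    as in SEED) into a shared latent space. Layer sizes are illustrative."""
    def __init__(self, in_dim=310, hidden=128, out_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim), nn.ReLU(),
        )

    def forward(self, x):
        return self.net(x)

def mk_mmd(source, target, sigmas=(1.0, 2.0, 4.0, 8.0)):
    """Biased estimate of multi-kernel MMD^2 between two feature batches,
    using a small bank of Gaussian kernels (bandwidths are illustrative)."""
    n_s = source.size(0)
    feats = torch.cat([source, target], dim=0)
    d2 = torch.cdist(feats, feats).pow(2)      # pairwise squared distances
    k = sum(torch.exp(-d2 / (2 * s ** 2)) for s in sigmas)
    return (k[:n_s, :n_s].mean()               # source-source similarity
            + k[n_s:, n_s:].mean()             # target-target similarity
            - 2 * k[:n_s, n_s:].mean())        # cross-domain similarity

extractor = FeatureExtractor()
classifier = nn.Linear(64, 3)                  # SEED has 3 emotion classes
optimizer = torch.optim.Adam(
    list(extractor.parameters()) + list(classifier.parameters()), lr=1e-3)
ce = nn.CrossEntropyLoss()

def train_step(x_src, y_src, x_tgt, lam=1.0):
    """One update: classification loss on labelled source-subject data plus
    an MMD penalty aligning source and target feature distributions."""
    f_src, f_tgt = extractor(x_src), extractor(x_tgt)
    loss = ce(classifier(f_src), y_src) + lam * mk_mmd(f_src, f_tgt)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example with random stand-in batches (the target batch is unlabelled):
x_s, y_s = torch.randn(32, 310), torch.randint(0, 3, (32,))
x_t = torch.randn(32, 310)
print(train_step(x_s, y_s, x_t))
```

In the full DAN of Long et al. (2015), the MK-MMD penalty is applied to several task-specific layers and the kernel bandwidths are chosen around the median pairwise distance; the single-layer, fixed-bandwidth version above is a deliberate simplification.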
Pages: 403-413
Page count: 11
Related Papers
50 records in total
  • [21] Multisource Transfer Learning for Cross-Subject EEG Emotion Recognition
    Li, Jinpeng
    Qiu, Shuang
    Shen, Yuan-Yuan
    Liu, Cheng-Lin
    He, Huiguang
    IEEE TRANSACTIONS ON CYBERNETICS, 2020, 50 (07) : 3281 - 3293
  • [22] Dynamic Domain Adaptation for Class-Aware Cross-Subject and Cross-Session EEG Emotion Recognition
    Li, Zhunan
    Zhu, Enwei
    Jin, Ming
    Fan, Cunhang
    He, Huiguang
    Cai, Ting
    Li, Jinpeng
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2022, 26 (12) : 5964 - 5973
  • [23] Personality first in emotion: a deep neural network based on electroencephalogram channel attention for cross-subject emotion recognition
    Tian, Zhihang
    Huang, Dongmin
    Zhou, Sijin
    Zhao, Zhidan
    Jiang, Dazhi
    ROYAL SOCIETY OPEN SCIENCE, 2021, 8 (08)
  • [25] Generator-based Domain Adaptation Method with Knowledge Free for Cross-subject EEG Emotion Recognition
    Huang, Dongmin
    Zhou, Sijin
    Jiang, Dazhi
    COGNITIVE COMPUTATION, 2022, 14 (04) : 1316 - 1327
  • [26] Multi-source Selective Graph Domain Adaptation Network for cross-subject EEG emotion recognition
    Wang, Jing
    Ning, Xiaojun
    Xu, Wei
    Li, Yunze
    Jia, Ziyu
    Lin, Youfang
    NEURAL NETWORKS, 2024, 180
  • [27] Learning a robust unified domain adaptation framework for cross-subject EEG-based emotion recognition
    Jimenez-Guarneros, Magdiel
    Fuentes-Pineda, Gibran
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86
  • [28] Capsule neural networks on spatio-temporal EEG frames for cross-subject emotion recognition
    Jana, Gopal Chandra
    Sabath, Anshuman
    Agrawal, Anupam
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 72
  • [29] Cross-Subject EEG-Based Emotion Recognition Through Neural Networks With Stratified Normalization
    Fdez, Javier
    Guttenberg, Nicholas
    Witkowski, Olaf
    Pasquali, Antoine
    FRONTIERS IN NEUROSCIENCE, 2021, 15
  • [30] Emotion recognition while applying cosmetic cream using deep learning from EEG data; cross-subject analysis
    Kim, Jieun
    Hwang, Dong-Uk
    Son, Edwin J.
    Oh, Sang Hoon
    Kim, Whansun
    Kim, Youngkyung
    Kwon, Gusang
    PLOS ONE, 2022, 17 (11)