Enhancing cross-subject EEG emotion recognition through multi-source manifold metric transfer learning

Cited by: 0
Authors
Shi X. [1]
She Q. [1,4]
Fang F. [2]
Meng M. [1,4]
Tan T. [3]
Zhang Y. [2]
Affiliations
[1] School of Automation, Hangzhou Dianzi University, Hangzhou, Zhejiang
[2] Department of Biomedical Engineering, University of Miami, Coral Gables, FL
[3] Department of Rehabilitation Medicine, Zhejiang Provincial People's Hospital, People's Hospital of Hangzhou Medical College, Hangzhou, Zhejiang
[4] International Joint Research Laboratory for Autonomous Robotic Systems, Hangzhou, Zhejiang
Funding
National Natural Science Foundation of China;
Keywords
Affective brain-computer interface (aBCI); Electroencephalogram (EEG); Emotion recognition; Metric transfer learning
DOI
10.1016/j.compbiomed.2024.108445
Abstract
Transfer learning (TL) has demonstrated its efficacy in addressing cross-subject domain adaptation challenges in affective brain-computer interfaces (aBCI). However, previous TL methods usually rely on a fixed distance measure, such as the Euclidean distance, to quantify the distribution dissimilarity between two domains, overlooking the inherent links among similar samples and potentially leading to suboptimal feature mapping. In this study, we introduce a novel algorithm called multi-source manifold metric transfer learning (MSMMTL) to enhance the efficacy of conventional TL. Specifically, we first select the source domains based on the Mahalanobis distance to improve their quality, and then use a manifold feature mapping approach to project the source and target domains onto the Grassmann manifold, mitigating data drift between domains. In this newly established shared space, we optimize the Mahalanobis metric by maximizing the inter-class distances while minimizing the intra-class distances in the target domain. Recognizing that significant distribution discrepancies may persist across domains even on the manifold, we further impose constraints on both domains under the Mahalanobis metric to keep the source and target distributions similar. This approach reduces distributional disparities and improves electroencephalogram (EEG) emotion recognition performance. In cross-subject experiments, the MSMMTL model achieves average classification accuracies of 88.83% and 65.04% on SEED and DEAP, respectively, underscoring the superiority of the proposed MSMMTL over other state-of-the-art methods. MSMMTL can effectively address individual differences in EEG-based affective computing. © 2024 Elsevier Ltd
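Note: The abstract describes several coupled steps (Mahalanobis-distance source selection, Grassmann-manifold mapping, metric learning, and cross-domain distribution constraints). As a rough, non-authoritative illustration, the Python sketch below isolates two of these ideas: selecting the source subjects closest to the target under a Mahalanobis distance, and an LDA-style scatter surrogate for the "maximize inter-class, minimize intra-class" metric objective. The function names, regularization constants, and the choice of k are assumptions made for illustration; the paper's actual MSMMTL formulation, including the Grassmann-manifold mapping and joint constraints, is not reproduced here.

import numpy as np

def domain_mahalanobis(mu_s, mu_t, cov):
    # Mahalanobis distance between a source-domain mean and the target mean.
    diff = mu_s - mu_t
    return float(np.sqrt(diff @ np.linalg.pinv(cov) @ diff))

def select_source_domains(source_sets, X_target, k=3):
    # Keep the k candidate source subjects whose feature means lie closest to
    # the target subject under the target-domain covariance (assumed selection
    # rule for illustration, not the paper's exact criterion).
    mu_t = X_target.mean(axis=0)
    cov_t = np.cov(X_target, rowvar=False) + 1e-6 * np.eye(X_target.shape[1])
    dists = [domain_mahalanobis(Xs.mean(axis=0), mu_t, cov_t) for Xs in source_sets]
    return [source_sets[i] for i in np.argsort(dists)[:k]]

def scatter_based_metric(X, y, reg=1e-3):
    # LDA-style surrogate: a Mahalanobis matrix that enlarges between-class
    # scatter relative to within-class scatter in the shared feature space.
    d = X.shape[1]
    Sw, Sb = np.zeros((d, d)), np.zeros((d, d))
    mu = X.mean(axis=0)
    for c in np.unique(y):
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        Sw += (Xc - mu_c).T @ (Xc - mu_c)
        diff = (mu_c - mu)[:, None]
        Sb += len(Xc) * diff @ diff.T
    M = np.linalg.pinv(Sw + reg * np.eye(d)) @ Sb
    return (M + M.T) / 2  # symmetrize the surrogate metric matrix

# Toy usage: three simulated source subjects, one target subject.
rng = np.random.default_rng(0)
sources = [rng.normal(m, 1.0, size=(60, 4)) for m in (0.0, 0.5, 3.0)]
X_t = rng.normal(0.2, 1.0, size=(40, 4))
kept = select_source_domains(sources, X_t, k=2)

In this toy run the two source subjects with means 0.0 and 0.5 would be retained, mirroring the intuition that source domains statistically closest to the target are most useful for transfer.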
Related papers
50 records in total
  • [31] Spatial-Temporal Constraint Learning for Cross-Subject EEG-Based Emotion Recognition
    Li, Wei
    Hou, Bowen
    Shao, Shitong
    Huan, Wei
    Tian, Ye
    2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [32] Manifold Feature Fusion with Dynamical Feature Selection for Cross-Subject Emotion Recognition
    Hua, Yue
    Zhong, Xiaolong
    Zhang, Bingxue
    Yin, Zhong
    Zhang, Jianhua
    BRAIN SCIENCES, 2021, 11 (11)
  • [33] Cross-Subject EEG-Based Emotion Recognition Through Neural Networks With Stratified Normalization
    Fdez, Javier
    Guttenberg, Nicholas
    Witkowski, Olaf
    Pasquali, Antoine
    FRONTIERS IN NEUROSCIENCE, 2021, 15
  • [34] FMLAN: A novel framework for cross-subject and cross-session EEG emotion recognition
    Yu, Peng
    He, Xiaopeng
    Li, Haoyu
    Dou, Haowen
    Tan, Yeyu
    Wu, Hao
    Chen, Badong
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2025, 100
  • [35] From Intricacy to Conciseness: A Progressive Transfer Strategy for EEG-Based Cross-Subject Emotion Recognition
    Cai, Ziliang
    Wang, Lingyue
    Guo, Miaomiao
    Xu, Guizhi
    Guo, Lei
    Li, Ying
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2022, 32 (03)
  • [36] Multi-Class Transfer Learning and Domain Selection for Cross-Subject EEG Classification
    Maswanganyi, Rito Clifford
    Tu, Chungling
    Owolawi, Pius Adewale
    Du, Shengzhi
    APPLIED SCIENCES-BASEL, 2023, 13 (08)
  • [37] Study on Driver Cross-Subject Emotion Recognition Based on Raw Multi-Channels EEG Data
    Wang, Zhirong
    Chen, Ming
    Feng, Guofu
    ELECTRONICS, 2023, 12 (11)
  • [38] Cross-Subject emotion recognition from EEG using Convolutional Neural Networks
    Zhong, Xiaolong
    Yin, Zhong
    Zhang, Jianhua
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020: 7516 - 7521
  • [39] Cross-Subject EEG-Based Emotion Recognition with Deep Domain Confusion
    Zhang, Weiwei
    Wang, Fei
    Jiang, Yang
    Xu, Zongfeng
    Wu, Shichao
    Zhang, Yahui
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2019, PT I, 2019, 11740 : 558 - 570
  • [40] Learning a robust unified domain adaptation framework for cross-subject EEG-based emotion recognition
    Jimenez-Guarneros, Magdiel
    Fuentes-Pineda, Gibran
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86