Emotion Recognition From Multimodal Physiological Signals via Discriminative Correlation Fusion With a Temporal Alignment Mechanism

Cited by: 8
Authors
Hou, Kechen [1 ]
Zhang, Xiaowei [1 ]
Yang, Yikun [1 ]
Zhao, Qiqi [1 ]
Yuan, Wenjie [1 ]
Zhou, Zhongyi [1 ]
Zhang, Sipo [1 ]
Li, Chen [1 ]
Shen, Jian [2 ]
Hu, Bin [2 ]
Affiliations
[1] Lanzhou Univ, Sch Informat Sci & Engn, Gansu Prov Key Lab Wearable Comp, Lanzhou 730000, Peoples R China
[2] Beijing Inst Technol, Sch Med Technol, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Coordinated fusion; emotion recognition; nervous systems; temporal alignment; HEART-RATE-VARIABILITY; CINGULATE CORTEX; BRAIN;
DOI
10.1109/TCYB.2023.3320107
CLC classification
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Modeling correlations between multimodal physiological signals, e.g., via canonical correlation analysis (CCA), for emotion recognition has attracted much attention. However, existing studies rarely consider the neural nature of emotional responses within physiological signals. Furthermore, during fusion space construction, the CCA method maximizes only the correlations between different modalities and neglects the discriminative information of different emotional states. Most importantly, temporal mismatches between different neural activities are often ignored; therefore, the theoretical assumption that multimodal data should be aligned in time and space before fusion is not fulfilled. To address these issues, we propose a discriminative correlation fusion method coupled with a temporal alignment mechanism for multimodal physiological signals. We first use neural signal analysis techniques to construct neural representations of the central nervous system (CNS) and autonomic nervous system (ANS), respectively. Then, emotion class labels are introduced into CCA to obtain more discriminative fusion representations from multimodal neural responses, and the temporal alignment between the CNS and ANS is jointly optimized with the fusion procedure using a Bayesian algorithm. The experimental results demonstrate that our method significantly improves emotion recognition performance. Additionally, we show that this fusion method can model the underlying mechanisms in human nervous systems during emotional responses, and our results are consistent with prior findings. This study may guide a new approach for exploring human cognitive function based on physiological signals at different time scales and promote the development of computational intelligence and harmonious human-computer interaction.
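To make the fusion idea concrete, the following minimal Python sketch (not the authors' implementation) pairs a crude lag-search alignment between CNS and ANS feature sequences with plain CCA fusion and a standard classifier. The function name align_by_lag, the toy data, and the scikit-learn CCA/SVC choices are illustrative assumptions; the paper's label-aware discriminative CCA and Bayesian joint optimization of alignment and fusion are omitted.

    # Illustrative sketch only: simple lag alignment + plain CCA fusion.
    import numpy as np
    from sklearn.cross_decomposition import CCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    def align_by_lag(cns, ans, max_lag=5):
        """Shift the ANS sequence by the lag that maximizes its mean
        cross-correlation with the CNS sequence (a crude stand-in for the
        paper's jointly optimized temporal alignment)."""
        best_lag, best_score = 0, -np.inf
        for lag in range(-max_lag, max_lag + 1):
            shifted = np.roll(ans, lag, axis=0)
            score = np.mean([
                np.corrcoef(cns[:, i], shifted[:, j])[0, 1]
                for i in range(cns.shape[1]) for j in range(ans.shape[1])
            ])
            if score > best_score:
                best_lag, best_score = lag, score
        return np.roll(ans, best_lag, axis=0), best_lag

    # Toy data: rows are time-ordered samples, columns are extracted features.
    rng = np.random.default_rng(0)
    cns_feats = rng.standard_normal((200, 8))   # e.g., EEG band-power features
    ans_feats = rng.standard_normal((200, 4))   # e.g., HRV / EDA features
    labels = rng.integers(0, 2, size=200)       # binary emotion labels

    ans_aligned, lag = align_by_lag(cns_feats, ans_feats)

    # Plain CCA only maximizes inter-modality correlation; the paper
    # additionally injects class labels for discriminative projections.
    cca = CCA(n_components=3)
    cns_proj, ans_proj = cca.fit_transform(cns_feats, ans_aligned)
    fused = np.hstack([cns_proj, ans_proj])

    clf = make_pipeline(StandardScaler(), SVC())
    clf.fit(fused, labels)
    print(f"chosen lag: {lag}, training accuracy: {clf.score(fused, labels):.2f}")

In practice, the lag search and the projection would be optimized jointly (as the abstract describes) rather than in two separate stages as in this sketch.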
Pages: 3079-3092
Number of pages: 14
Related Articles
50 records in total
  • [41] Emotion Recognition from Physiological Signals Using AdaBoost
    Cheng, Bo
    APPLIED INFORMATICS AND COMMUNICATION, PT I, 2011, 224 : 412 - 417
  • [42] Emotion Recognition from Physiological Signals Using AdaBoost
    Cheng, Bo
    2010 THE 3RD INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND INDUSTRIAL APPLICATION (PACIIA2010), VOL I, 2010, : 233 - 235
  • [43] Enhancing emotion recognition using multimodal fusion of physiological, environmental, personal data
    Kim, Hakpyeong
    Hong, Taehoon
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 249
  • [44] Multimodal emotion recognition based on the fusion of vision, EEG, ECG, and EMG signals
    Bhatlawande, Shripad
    Pramanik, Sourjadip
    Shilaskar, Swati
    Sole, Swarali
    INTERNATIONAL JOURNAL OF ELECTRICAL AND COMPUTER ENGINEERING SYSTEMS, 2024, 15 (01) : 41 - 58
  • [45] Multimodal Emotion Recognition From EEG Signals and Facial Expressions
    Wang, Shuai
    Qu, Jingzi
    Zhang, Yong
    Zhang, Yidie
    IEEE ACCESS, 2023, 11 : 33061 - 33068
  • [46] Group Gated Fusion on Attention-based Bidirectional Alignment for Multimodal Emotion Recognition
    Liu, Pengfei
    Li, Kun
    Meng, Helen
    INTERSPEECH 2020, 2020, : 379 - 383
  • [47] Feature Fusion for Multimodal Emotion Recognition Based on Deep Canonical Correlation Analysis
    Zhang, Ke
    Li, Yuanqing
    Wang, Jingyu
    Wang, Zhen
    Li, Xuelong
    IEEE SIGNAL PROCESSING LETTERS, 2021, 28 : 1898 - 1902
  • [48] Negative emotion recognition using multimodal physiological signals for advanced driver assistance systems
    Chie Hieida
    Tomoaki Yamamoto
    Takatomi Kubo
    Junichiro Yoshimoto
    Kazushi Ikeda
    Artificial Life and Robotics, 2023, 28 : 388 - 393
  • [49] A Convolution Neural Network Based Emotion Recognition System using Multimodal Physiological Signals
    Yang, Cheng-Jie
    Fahier, Nicolas
    Li, Wei-Chih
    Fang, Wai-Chi
    2020 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TAIWAN), 2020,
  • [50] Negative emotion recognition using multimodal physiological signals for advanced driver assistance systems
    Hieida, Chie
    Yamamoto, Tomoaki
    Kubo, Takatomi
    Yoshimoto, Junichiro
    Ikeda, Kazushi
    ARTIFICIAL LIFE AND ROBOTICS, 2023, 28 (02) : 388 - 393