Emotion Recognition From Multimodal Physiological Signals via Discriminative Correlation Fusion With a Temporal Alignment Mechanism

Cited by: 8
Authors
Hou, Kechen [1 ]
Zhang, Xiaowei [1 ]
Yang, Yikun [1 ]
Zhao, Qiqi [1 ]
Yuan, Wenjie [1 ]
Zhou, Zhongyi [1 ]
Zhang, Sipo [1 ]
Li, Chen [1 ]
Shen, Jian [2 ]
Hu, Bin [2 ]
Affiliations
[1] Lanzhou Univ, Sch Informat Sci & Engn, Gansu Prov Key Lab Wearable Comp, Lanzhou 730000, Peoples R China
[2] Beijing Inst Technol, Sch Med Technol, Beijing 100081, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Coordinated fusion; emotion recognition; nervous systems; temporal alignment; HEART-RATE-VARIABILITY; CINGULATE CORTEX; BRAIN;
DOI
10.1109/TCYB.2023.3320107
Chinese Library Classification (CLC) number
TP [automation technology, computer technology];
Discipline code
0812;
Abstract
Modeling correlations between multimodal physiological signals, e.g., via canonical correlation analysis (CCA), for emotion recognition has attracted much attention. However, existing studies rarely consider the neural nature of emotional responses within physiological signals. Furthermore, during fusion space construction, the CCA method maximizes only the correlations between different modalities and neglects the discriminative information of different emotional states. Most importantly, temporal mismatches between different neural activities are often ignored; therefore, the theoretical assumption that multimodal data should be aligned in time and space before fusion is not fulfilled. To address these issues, we propose a discriminative correlation fusion method coupled with a temporal alignment mechanism for multimodal physiological signals. We first use neural signal analysis techniques to construct neural representations of the central nervous system (CNS) and the autonomic nervous system (ANS), respectively. Then, emotion class labels are introduced into CCA to obtain more discriminative fusion representations from multimodal neural responses, and the temporal alignment between the CNS and ANS is jointly optimized with the fusion procedure using a Bayesian algorithm. The experimental results demonstrate that our method significantly improves emotion recognition performance. Additionally, we show that this fusion method can model the underlying mechanisms of the human nervous system during emotional responses, and our results are consistent with prior findings. This study may suggest a new approach for exploring human cognitive function based on physiological signals at different time scales and promote the development of computational intelligence and harmonious human-computer interaction.
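The fusion idea described in the abstract can be illustrated with a generic CCA pipeline: project CNS and ANS feature vectors into a maximally correlated space and classify the concatenated canonical variates. The snippet below is a minimal sketch under that reading only; it uses plain scikit-learn CCA on synthetic data and does not implement the paper's discriminative CCA (label-aware fusion space) or its Bayesian temporal alignment. All feature dimensions, variable names, and the classifier choice are illustrative assumptions.

```python
# Minimal, illustrative sketch of CCA-based fusion of CNS (e.g., EEG) and
# ANS (e.g., HRV/peripheral) features for emotion classification.
# NOTE: plain CCA is used as a stand-in; the paper's discriminative CCA and
# temporal alignment mechanism are NOT reproduced here. Data are synthetic.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_trials = 200
X_cns = rng.normal(size=(n_trials, 32))   # hypothetical EEG feature vectors
X_ans = rng.normal(size=(n_trials, 12))   # hypothetical HRV/peripheral features
y = rng.integers(0, 2, size=n_trials)     # toy binary emotion labels

Xc_tr, Xc_te, Xa_tr, Xa_te, y_tr, y_te = train_test_split(
    X_cns, X_ans, y, test_size=0.3, random_state=0)

# Learn projections that maximize the correlation between the two modalities.
cca = CCA(n_components=6)
cca.fit(Xc_tr, Xa_tr)
Zc_tr, Za_tr = cca.transform(Xc_tr, Xa_tr)
Zc_te, Za_te = cca.transform(Xc_te, Xa_te)

# Coordinated fusion: concatenate the canonical variates and classify.
clf = SVC(kernel="rbf").fit(np.hstack([Zc_tr, Za_tr]), y_tr)
print("toy accuracy:", accuracy_score(y_te, clf.predict(np.hstack([Zc_te, Za_te]))))
```

In the method proposed by the paper, the emotion class labels would additionally shape the projection matrices, and the CNS and ANS feature windows would be re-aligned in time (jointly optimized via a Bayesian algorithm) before this fusion step.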
Pages: 3079-3092
Number of pages: 14
Related papers
50 records in total
  • [31] Research on Feature Fusion for Emotion Recognition Based on Discriminative Canonical Correlation Analysis
    Liu, Chuqi
    Li, Chao
    Zhao, Ziping
    PROCEEDINGS OF 2018 INTERNATIONAL CONFERENCE ON MATHEMATICS AND ARTIFICIAL INTELLIGENCE (ICMAI 2018), 2018, : 30 - 36
  • [32] TAGformer: A Multimodal Physiological Signals Fusion Network for Pilot Stress Recognition
    Wang, Shaofan
    Li, Yuangan
    Zhang, Tao
    Li, Ke
    IEEE SENSORS JOURNAL, 2024, 24 (13) : 20842 - 20854
  • [33] Discriminative-CCA Promoted By EEG signals For Physiological-based Emotion Recognition
    Zhao, Wenping
    Zhao, Ziping
    Li, Chao
    2018 FIRST ASIAN CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION (ACII ASIA), 2018,
  • [34] Multimodal Fusion of Physiological Signals and Facial Action Units for Pain Recognition
    Hinduja, Saurabh
    Canavan, Shaun
    Kaur, Gurmeet
    2020 15TH IEEE INTERNATIONAL CONFERENCE ON AUTOMATIC FACE AND GESTURE RECOGNITION (FG 2020), 2020, : 577 - 581
  • [35] Emotion Recognition Based on Weighted Fusion Strategy of Multichannel Physiological Signals
    Wei, Wei
    Jia, Qingxuan
    Feng, Yongli
    Chen, Gang
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2018, 2018
  • [36] A novel feature fusion network for multimodal emotion recognition from EEG and eye movement signals
    Fu, Baole
    Gu, Chunrui
    Fu, Ming
    Xia, Yuxiao
    Liu, Yinhua
    FRONTIERS IN NEUROSCIENCE, 2023, 17
  • [37] Multimodal Stability-Sensitive Emotion Recognition based on Brainwave and Physiological Signals
    Thammasan, Nattapong
    Hagad, Juan Lorenzo
    Fukui, Ken-ichi
    Numao, Masayuki
    2017 SEVENTH INTERNATIONAL CONFERENCE ON AFFECTIVE COMPUTING AND INTELLIGENT INTERACTION WORKSHOPS AND DEMOS (ACIIW), 2017, : 44 - 49
  • [38] Multimodal Emotion Recognition by Combining Physiological Signals and Facial Expressions: A Preliminary Study
    Kortelainen, Jukka
    Tiinanen, Suvi
    Huang, Xiaohua
    Li, Xiaobai
    Laukka, Seppo
    Pietikainen, Matti
    Seppanen, Tapio
    2012 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2012, : 5238 - 5241
  • [39] Emotion Recognition Based on Multi-Variant Correlation of Physiological Signals
    Wen, Wanhui
    Liu, Guangyuan
    Cheng, Nanpu
    Wei, Jie
    Shangguan, Pengchao
    Huang, Wenjin
    IEEE TRANSACTIONS ON AFFECTIVE COMPUTING, 2014, 5 (02) : 126 - 140
  • [40] Emotion Recognition from Physiological Signals Based on ASAGA
    Zhou, Lianzhe
    Pang, Huanli
    Liu, Hanmei
    PROCEEDINGS OF THE 2012 INTERNATIONAL CONFERENCE ON COMMUNICATION, ELECTRONICS AND AUTOMATION ENGINEERING, 2013, 181 : 735 - 740