Research on Emotion Recognition Method of Flight Training Based on Multimodal Fusion

Cited: 2
Authors
Wang, Wendong [1 ]
Zhang, Haoyang [1 ]
Zhang, Zhibin [1 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Mech Engn, Xian, Peoples R China
Keywords
Emotion recognition; intelligent perception; adaptive dynamic fusion; multimodal fusion; ALGORITHMS; FEATURES; MACHINE;
DOI
10.1080/10447318.2023.2254644
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology]
Discipline classification code
0812
Abstract
The emotional activities of the human body are regulated mainly by the autonomic nervous system, the central nervous system, and the higher cognitive functions of the human brain. This paper proposes an emotional-state recognition method for pilot training tasks based on multimodal information fusion. Based on the human-computer interaction mode during flight training, an emotion perception and recognition system, a two-dimensional valence-arousal emotion model, and a multimodal information intelligent perception model were established; an intelligent perception system was designed to collect and perceive four kinds of peripheral physiological signals from pilots in real time. Building on traditional machine learning models, a binary-tree support vector machine was designed to optimize and improve the multimodal information co-integration decision model, which increased the accuracy of emotional-state recognition in flight training by 37.58% on average. The experimental results show that the method achieves accurate real-time monitoring and identification of emotional state, helps improve the effectiveness of flight training and flight safety, maintains operational efficiency, and has important research significance and application prospects in the field of pilot training.
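As a rough illustration of the decision structure described in the abstract, the minimal sketch below (not taken from the paper) arranges three binary SVMs as a tree over the two-dimensional valence-arousal model: a root classifier separates high from low arousal, and two leaf classifiers separate positive from negative valence within each branch. The quadrant labelling scheme, the stand-in feature vectors, and all hyperparameters are assumptions for illustration only; real input would be features extracted from the four peripheral physiological signals.

# Minimal sketch of a binary-tree SVM for valence-arousal quadrant recognition.
# Assumptions (not from the paper): feature vectors are pre-extracted from the
# peripheral physiological signals; labels 0-3 denote the four quadrants of the
# valence-arousal plane (0/1 = high arousal, 2/3 = low arousal).
import numpy as np
from sklearn.svm import SVC

class BinaryTreeSVM:
    """Three binary SVMs arranged as a tree: the root separates high vs. low
    arousal; each leaf-level SVM separates the two quadrants within its branch."""

    def __init__(self, **svm_kwargs):
        self.root = SVC(**svm_kwargs)   # high arousal (0,1) vs. low arousal (2,3)
        self.high = SVC(**svm_kwargs)   # within high arousal: 0 vs. 1
        self.low = SVC(**svm_kwargs)    # within low arousal: 2 vs. 3

    def fit(self, X, y):
        X, y = np.asarray(X), np.asarray(y)
        arousal = (y <= 1).astype(int)  # 1 = high-arousal branch, 0 = low-arousal branch
        self.root.fit(X, arousal)
        self.high.fit(X[y <= 1], y[y <= 1])
        self.low.fit(X[y >= 2], y[y >= 2])
        return self

    def predict(self, X):
        X = np.asarray(X)
        branch = self.root.predict(X)
        out = np.empty(len(X), dtype=int)
        hi, lo = branch == 1, branch == 0
        if hi.any():
            out[hi] = self.high.predict(X[hi])
        if lo.any():
            out[lo] = self.low.predict(X[lo])
        return out

# Illustrative usage with random stand-in features in place of fused
# physiological features (e.g., ECG/EDA/respiration/skin-temperature statistics).
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(200, 12)), rng.integers(0, 4, size=200)
model = BinaryTreeSVM(kernel="rbf", C=1.0).fit(X_train, y_train)
print(model.predict(rng.normal(size=(5, 12))))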
Pages: 6478-6491
Number of pages: 14
Related papers
50 records
  • [21] Editorial for the special issue on "Research on methods of multimodal information fusion in emotion recognition"
    Xia, Kaijian
    Hu, Tao
    Si, Wen
    PERSONAL AND UBIQUITOUS COMPUTING, 2019, 23 (3-4) : 359 - 361
  • [22] MULTIMODAL EMOTION RECOGNITION WITH CAPSULE GRAPH CONVOLUTIONAL BASED REPRESENTATION FUSION
    Liu, Jiaxing
    Chen, Sen
    Wang, Longbiao
    Liu, Zhilei
    Fu, Yahui
    Guo, Lili
    Dang, Jianwu
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 6339 - 6343
  • [23] Cross-Subject Multimodal Emotion Recognition Based on Hybrid Fusion
    Cimtay, Yucel
    Ekmekcioglu, Erhan
    Caglar-Ozhan, Seyma
    IEEE ACCESS, 2020, 8 : 168865 - 168878
  • [24] Real-time music emotion recognition based on multimodal fusion
    Hao, Xingye
    Li, Honghe
    Wen, Yonggang
    ALEXANDRIA ENGINEERING JOURNAL, 2025, 116 : 586 - 600
  • [25] An Improved Multimodal Dimension Emotion Recognition Based on Different Fusion Methods
    Su, Haiyang
    Liu, Bin
    Tao, Jianhua
    Dong, Yongfeng
    Huang, Jian
    Lian, Zheng
    Song, Leichao
    PROCEEDINGS OF 2020 IEEE 15TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP 2020), 2020, : 257 - 261
  • [26] Rapid recognition of athlete's anxiety emotion based on multimodal fusion
    Wang, Li
    INTERNATIONAL JOURNAL OF BIOMETRICS, 2024, 16 (05) : 449 - 462
  • [27] Research on a Microexpression Recognition Technology Based on Multimodal Fusion
    Kang, Jie
    Chen, Xiao Ying
    Liu, Qi Yuan
    Jin, Si Han
    Yang, Cheng Han
    Hu, Cong
    COMPLEXITY, 2021, 2021
  • [28] Research on Gait Recognition Based on GaitSet and Multimodal Fusion
    Shi, Xiling
    Zhao, Wenqiang
    Pei, Huandou
    Zhai, Hongru
    Gao, Yongxia
    IEEE ACCESS, 2025, 13 : 20017 - 20024
  • [29] A multimodal emotion recognition method based on facial expressions and electroencephalography
    Tan, Ying
    Sun, Zhe
    Duan, Feng
    Sole-Casals, Jordi
    Caiafa, Cesar F.
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2021, 70
  • [30] Multimodal emotion recognition algorithm based on edge network emotion element compensation and data fusion
    Wang, Yu
    PERSONAL AND UBIQUITOUS COMPUTING, 2019, 23 : 383 - 392