Multimodal Real-Time patient emotion recognition system using facial expressions and brain EEG signals based on Machine learning and Log-Sync methods

Cited by: 3
Authors
Mutawa, A. M. [1 ]
Hassouneh, Aya [1 ]
Affiliations
[1] Kuwait Univ, Coll Engn & Petr, Dept Comp Engn, Kuwait, Kuwait
关键词
EEG; Multi-modal; Neural network; Log-sync; Face recognition; Emotion recognition; Hospitalized patients
DOI
10.1016/j.bspc.2023.105942
Chinese Library Classification
R318 [Biomedical Engineering]
Subject Classification Code
0831
Abstract
Human-Machine Interface (HMI) applications depend on emotion detection, especially for hospitalized patients. The emergence of the fourth industrial revolution (4IR) has heightened interest in emotional intelligence for human-computer interaction (HCI). This work employs electroencephalography (EEG), an optical flow algorithm, and machine learning to create a multimodal, intelligent, real-time emotion recognition system. The objective is to assist hospitalized patients, disabled (deaf, mute, and bedridden) individuals, and autistic children in expressing their authentic feelings. We fed our multimodal fusion feature vector to a long short-term memory (LSTM) classifier and distinguished six fundamental emotions: anger, disgust, fear, sadness, joy, and surprise. The fusion feature vector was created from the patient's geometric facial features and EEG inputs. From 14 EEG channels, we computed relative power in four bands: alpha (8-13 Hz), beta (13-30 Hz), gamma (30-49 Hz), and theta (4-8 Hz). We achieved a maximum recognition rate of 90.25 percent using facial landmarks alone and 87.25 percent using EEG data alone. When both the facial and EEG streams were combined, the multimodal approach achieved 99.3 percent accuracy.
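The sketch below illustrates the pipeline the abstract describes: relative band power extracted from 14 EEG channels, concatenated (fused) with geometric facial-landmark features, and fed to an LSTM classifier over six emotions. It is a minimal illustration only; the sampling rate, window length, landmark count (68 points x 2 coordinates), sequence length, and LSTM size are assumptions, not the authors' implementation, and the "Log-Sync" step from the title, which presumably time-aligns the EEG and facial streams, is omitted.

import numpy as np
from scipy.signal import welch
from tensorflow.keras import layers, models

# Frequency bands reported in the abstract (Hz).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 49)}

def relative_band_power(eeg_window, fs=128):
    """eeg_window: (n_channels, n_samples) array -> flat vector of
    per-channel relative band powers (14 channels x 4 bands = 56 values).
    fs is an assumed sampling rate."""
    feats = []
    for channel in eeg_window:
        # Welch power spectral density estimate for one channel.
        f, pxx = welch(channel, fs=fs, nperseg=min(fs, channel.size))
        total = pxx[(f >= 4) & (f < 49)].sum()
        for lo, hi in BANDS.values():
            feats.append(pxx[(f >= lo) & (f < hi)].sum() / total)
    return np.asarray(feats)

def fuse(eeg_feats, landmark_feats):
    """Concatenate EEG band powers with geometric facial-landmark features."""
    return np.concatenate([eeg_feats, landmark_feats])

# Assumed input: sequences of 20 fused feature vectors per sample
# (56 EEG features + 68 landmarks x 2 coordinates = 192 features per step).
timesteps, n_features = 20, 56 + 68 * 2
model = models.Sequential([
    layers.Input(shape=(timesteps, n_features)),
    layers.LSTM(64),
    layers.Dense(6, activation="softmax"),  # six basic emotions
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])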
Pages: 6