Emotion Recognition from Multi-Channel EEG Data through Convolutional Recurrent Neural Network

Cited: 0
Authors
Li, Xiang [1 ]
Song, Dawei [1 ,2 ]
Zhang, Peng [1 ]
Yu, Guangliang [1 ]
Hou, Yuexian [1 ]
Hu, Bin [3 ]
Affiliations
[1] Tianjin Univ, Tianjin Key Lab Cognit Comp & Applicat, Tianjin, Peoples R China
[2] Open Univ, Sch Comp & Commun, Milton Keynes, Bucks, England
[3] Lanzhou Univ, Sch Informat Sci & Engn, Lanzhou 730000, Peoples R China
Keywords
CNN; EEG; emotion recognition; LSTM; physiological signal;
DOI: not available
Chinese Library Classification (CLC): TP39 [Computer Applications]
Discipline codes: 081203; 0835
Abstract
Automatic emotion recognition from multi-channel neurophysiological signals is a challenging pattern recognition task that is becoming an important computer-aided method for diagnosing emotional disorders in neurology and psychiatry. Traditional approaches require designing and extracting a range of features from single- or multiple-channel signals based on extensive domain knowledge, which may be an obstacle for non-domain experts. Moreover, traditional feature fusion methods cannot fully exploit the correlation information between different channels. In this paper, we propose a preprocessing method that encapsulates the multi-channel neurophysiological signals into grid-like frames through wavelet and scalogram transforms. We further design a hybrid deep learning model that combines a Convolutional Neural Network (CNN) and a Recurrent Neural Network (RNN) to extract task-related features, mine inter-channel correlations, and incorporate contextual information from those frames. Experiments are carried out on a trial-level emotion recognition task using the DEAP benchmark dataset. Our results demonstrate the effectiveness of the proposed methods with respect to the emotional dimensions of Valence and Arousal.
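The core preprocessing idea in the abstract — encapsulating multi-channel signals into a sequence of grid-like frames that a CNN (per frame) and an RNN (over frames) can consume — can be sketched roughly as follows. The electrode-to-grid mapping, window length, and the use of mean power in place of a full wavelet/scalogram feature are illustrative assumptions for this sketch, not the paper's exact configuration.

```python
import numpy as np

# Illustrative electrode-to-grid mapping: each EEG channel index is assigned
# a (row, col) cell in a sparse 2D grid approximating scalp topology.
# A real mapping would follow the 10-20 electrode layout.
CHANNEL_GRID = {0: (0, 1), 1: (0, 3), 2: (1, 0), 3: (1, 2),
                4: (1, 4), 5: (2, 1), 6: (2, 3), 7: (3, 2)}
GRID_SHAPE = (4, 5)

def signals_to_frames(signals, win_len):
    """Cut multi-channel signals (channels x samples) into non-overlapping
    windows and scatter each window's per-channel feature into a grid frame.

    Returns an array of shape (num_windows, rows, cols): a sequence of
    grid-like frames suitable for a CNN (per frame) + RNN (over frames).
    """
    n_channels, n_samples = signals.shape
    n_windows = n_samples // win_len
    frames = np.zeros((n_windows, *GRID_SHAPE))
    for w in range(n_windows):
        window = signals[:, w * win_len:(w + 1) * win_len]
        # Per-channel mean power stands in for a wavelet/scalogram feature.
        power = (window ** 2).mean(axis=1)
        for ch, (r, c) in CHANNEL_GRID.items():
            frames[w, r, c] = power[ch]
    return frames

# Example: 8 channels, 1 s of 128 Hz data, 32-sample windows -> 4 frames.
eeg = np.random.default_rng(0).standard_normal((8, 128))
frames = signals_to_frames(eeg, win_len=32)
print(frames.shape)  # (4, 4, 5)
```

Grid cells with no electrode stay zero, so spatially adjacent channels land in adjacent cells and a 2D convolution can pick up inter-channel correlations directly.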
Pages: 352-359 (8 pages)
Related papers (50 total)
  • [31] Emotion Recognition with Multi-Channel EEG Signals Using Visual Stimulus
    Ergin, Tugba
    Ozdemir, Mehmet Akif
    Akan, Aydin
    [J]. 2019 MEDICAL TECHNOLOGIES CONGRESS (TIPTEKNO), 2019, : 475 - 478
  • [32] SAE+LSTM: A New Framework for Emotion Recognition From Multi-Channel EEG
    Xing, Xiaofen
    Li, Zhenqi
    Xu, Tianyuan
    Shu, Lin
    Hu, Bin
    Xu, Xiangmin
    [J]. FRONTIERS IN NEUROROBOTICS, 2019, 13
  • [33] EEG-based emotion recognition using 4D convolutional recurrent neural network
    Shen, Fangyao
    Dai, Guojun
    Lin, Guang
    Zhang, Jianhai
    Kong, Wanzeng
    Zeng, Hong
    [J]. COGNITIVE NEURODYNAMICS, 2020, 14 (06) : 815 - 828
  • [34] Multidimensional Feature in Emotion Recognition Based on Multi-Channel EEG Signals
    Li, Qi
    Liu, Yunqing
    Liu, Quanyang
    Zhang, Qiong
    Yan, Fei
    Ma, Yimin
    Zhang, Xinyu
    [J]. ENTROPY, 2022, 24 (12)
  • [35] Emotion Recognition with Multi-Channel EEG Signals Using Auditory Stimulus
    Gunes, Cansu
    Ozdemir, Mehmet Akif
    Akan, Aydin
    [J]. 2019 MEDICAL TECHNOLOGIES CONGRESS (TIPTEKNO), 2019, : 271 - 274
  • [37] Emotion recognition by deeply learned multi-channel textual and EEG features
    Liu, Yishu
    Fu, Guifang
    [J]. FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2021, 119 : 1 - 6
  • [38] Emotion recognition with convolutional neural network and EEG-based EFDMs
    Wang, Fei
    Wu, Shichao
    Zhang, Weiwei
    Xu, Zongfeng
    Zhang, Yahui
    Wu, Chengdong
    Coleman, Sonya
    [J]. NEUROPSYCHOLOGIA, 2020, 146
  • [39] EEG-based emotion recognition with cascaded convolutional recurrent neural networks
    Meng, Ming
    Zhang, Yu
    Ma, Yuliang
    Gao, Yunyuan
    Kong, Wanzeng
    [J]. PATTERN ANALYSIS AND APPLICATIONS, 2023, 26 (02) : 783 - 795