Investigating the Use of Pretrained Convolutional Neural Network on Cross-Subject and Cross-Dataset EEG Emotion Recognition

Cited by: 142
Authors
Cimtay, Yucel [1 ]
Ekmekcioglu, Erhan [1 ]
Affiliations
[1] Loughborough Univ London, Inst Digital Technol, London E20 3BS, England
Keywords
EEG; emotion recognition; pretrained models; convolutional neural network; dense layer; subject independency; dataset independency; raw data; filtering on output; FEATURES; KERNEL;
DOI
10.3390/s20072034
CLC number
O65 [Analytical Chemistry];
Subject classification codes
070302; 081704;
Abstract
The electroencephalogram (EEG) is highly attractive for emotion recognition studies because of its resistance to deceptive actions; this is one of the most significant advantages of brain signals over visual or speech signals in the emotion recognition context. A major challenge in EEG-based emotion recognition is that EEG recordings exhibit varying distributions across different people, as well as for the same person at different times. This nonstationary nature of EEG limits its accuracy when subject independence is the priority. The aim of this study is to increase subject-independent recognition accuracy by exploiting pretrained state-of-the-art Convolutional Neural Network (CNN) architectures. Unlike similar studies that extract spectral band power features from the EEG readings, our study uses raw EEG data after applying windowing, pre-adjustments, and normalization. Removing manual feature extraction from the training pipeline avoids the risk of discarding hidden features in the raw data and helps leverage the deep neural network's power to uncover unknown features. To improve classification accuracy further, a median filter is applied to eliminate false detections along a prediction interval of emotions. This method yields mean cross-subject accuracies of 86.56% and 78.34% on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED) for two and three emotion classes, respectively. It also yields a mean cross-subject accuracy of 72.81% on the Database for Emotion Analysis using Physiological Signals (DEAP) and 81.8% on the Loughborough University Multimodal Emotion Dataset (LUMED) for two emotion classes. Furthermore, the recognition model trained on the SEED dataset was tested on the DEAP dataset, yielding a mean prediction accuracy of 58.1% across all subjects and emotion classes.
Results show that, in terms of classification accuracy, the proposed approach is superior to, or on par with, the reference subject-independent EEG emotion recognition studies identified in the literature, and has limited complexity because it eliminates the need for feature extraction.
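The preprocessing and post-filtering steps described in the abstract (windowing and normalization of raw EEG, then median filtering of the per-window predictions) can be sketched as follows. This is an illustrative sketch only: the window length, overlap, and median-filter size are assumptions, since the abstract does not state the actual values used in the paper.

```python
import statistics

def window_and_normalize(signal, win_len=256, step=128):
    """Segment a single-channel raw EEG trace into overlapping windows
    and z-score normalize each window (illustrative parameters)."""
    windows = []
    for start in range(0, len(signal) - win_len + 1, step):
        seg = signal[start:start + win_len]
        mu = statistics.fmean(seg)
        sd = statistics.pstdev(seg) or 1.0  # guard against flat segments
        windows.append([(x - mu) / sd for x in seg])
    return windows

def median_filter_predictions(labels, window=3):
    """Smooth the sequence of per-window class predictions with a
    sliding median, suppressing isolated false detections.
    Edge positions without a full window are left unchanged."""
    half = window // 2
    out = list(labels)
    for i in range(half, len(labels) - half):
        out[i] = int(statistics.median(labels[i - half:i + half + 1]))
    return out
```

For example, an isolated spurious prediction of class 1 inside a run of class 0 is removed by `median_filter_predictions`, while a sustained change of class is preserved, which matches the abstract's stated purpose of eliminating false detections along a prediction interval.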
Pages: 20