Investigating the Use of Pretrained Convolutional Neural Network on Cross-Subject and Cross-Dataset EEG Emotion Recognition

Cited by: 142
Authors
Cimtay, Yucel [1 ]
Ekmekcioglu, Erhan [1 ]
Affiliations
[1] Loughborough Univ London, Inst Digital Technol, London E20 3BS, England
Keywords
EEG; emotion recognition; pretrained models; convolutional neural network; dense layer; subject independency; dataset independency; raw data; filtering on output; FEATURES; KERNEL;
DOI
10.3390/s20072034
CLC Number
O65 [Analytical Chemistry];
Subject Classification Codes
070302 ; 081704 ;
Abstract
The electroencephalogram (EEG) is highly attractive in emotion recognition studies due to its resistance to deceptive actions of humans. This is one of the most significant advantages of brain signals over visual or speech signals in the emotion recognition context. A major challenge in EEG-based emotion recognition is that EEG recordings exhibit varying distributions for different people, as well as for the same person at different time instances. This nonstationary nature of EEG limits recognition accuracy when subject independence is the priority. The aim of this study is to increase subject-independent recognition accuracy by exploiting pretrained state-of-the-art Convolutional Neural Network (CNN) architectures. Unlike similar studies that extract spectral band power features from the EEG readings, our study uses raw EEG data after applying windowing, pre-adjustments, and normalization. Removing manual feature extraction from the training pipeline avoids the risk of eliminating hidden features in the raw data and helps leverage the deep neural network's power in uncovering unknown features. To improve the classification accuracy further, a median filter is used to eliminate false detections along a prediction interval of emotions. This method yields a mean cross-subject accuracy of 86.56% and 78.34% on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED) for two and three emotion classes, respectively. It also yields a mean cross-subject accuracy of 72.81% on the Database for Emotion Analysis using Physiological Signals (DEAP) and 81.8% on the Loughborough University Multimodal Emotion Dataset (LUMED) for two emotion classes. Furthermore, the recognition model trained on the SEED dataset was tested on the DEAP dataset, yielding a mean prediction accuracy of 58.1% across all subjects and emotion classes.
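The raw-data preparation described above (windowing followed by normalization) could be sketched as follows. This is a minimal illustration, not the authors' code: the window length, step size, and per-window min-max scaling are assumed parameters, since the abstract does not specify them.

```python
import numpy as np

def window_and_normalize(eeg, win_len, step):
    """Segment raw multi-channel EEG (channels x samples) into
    fixed-length windows and min-max normalize each window to [0, 1].
    A stand-in for the paper's windowing/normalization stage."""
    n_ch, n_samp = eeg.shape
    windows = []
    for start in range(0, n_samp - win_len + 1, step):
        w = eeg[:, start:start + win_len]
        lo, hi = w.min(), w.max()
        # small epsilon guards against a flat (constant) window
        windows.append((w - lo) / (hi - lo + 1e-8))
    return np.stack(windows)  # shape: (n_windows, channels, win_len)

# Hypothetical recording: 32 channels, 1000 samples
rng = np.random.default_rng(0)
X = window_and_normalize(rng.standard_normal((32, 1000)), win_len=256, step=128)
print(X.shape)
```

Each normalized window can then be fed (after any channel/spatial rearrangement the chosen pretrained CNN expects) directly into the network, with no hand-crafted spectral features.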
Results show that, in terms of classification accuracy, the proposed approach is superior to, or on par with, the reference subject-independent EEG emotion recognition studies identified in the literature, and has limited complexity owing to the elimination of the need for feature extraction.
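The "filtering on output" step mentioned in the keywords and abstract, smoothing the sequence of per-window predictions with a median filter to suppress isolated false detections, can be sketched as follows. This is a hedged illustration: the kernel size of 5 and the edge-padding choice are assumptions, not values taken from the paper.

```python
import numpy as np

def smooth_predictions(labels, kernel=5):
    """Apply a sliding median filter to a sequence of per-window
    class labels, removing isolated spurious predictions."""
    x = np.asarray(labels)
    pad = kernel // 2
    padded = np.pad(x, pad, mode="edge")  # repeat edge labels at the borders
    windows = np.lib.stride_tricks.sliding_window_view(padded, kernel)
    return np.median(windows, axis=1).astype(int)

# A single spurious class-2 prediction inside a stable class-0 interval
preds = [0, 0, 0, 2, 0, 0, 1, 1, 1, 1]
print(smooth_predictions(preds).tolist())  # the isolated 2 is removed
```

Because emotions are assumed to persist over a prediction interval, a lone outlier label is far more likely to be a misclassification than a genuine sub-second emotion change, which is what makes median filtering effective here.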
Pages: 20
Related Papers
50 items in total
  • [31] EEG Feature Selection for Emotion Recognition Based on Cross-subject Recursive Feature Elimination
    Zhang, Wei
    Yin, Zhong
    [J]. PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 6256 - 6261
  • [32] Functional connectivity-enhanced feature-grouped attention network for cross-subject EEG emotion recognition
    Guo, Wenhui
    Li, Yaxuan
    Liu, Mengxue
    Ma, Rui
    Wang, Yanjiang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2024, 283
  • [33] Temporal aware Mixed Attention-based Convolution and Transformer Network for cross-subject EEG emotion recognition
    Si, Xiaopeng
    Huang, Dong
    Liang, Zhen
    Sun, Yulin
    Huang, He
    Liu, Qile
    Yang, Zhuobin
    Ming, Dong
    [J]. COMPUTERS IN BIOLOGY AND MEDICINE, 2024, 181
  • [34] Cross-Subject EEG Signal Recognition Using Deep Domain Adaptation Network
    Hang, Wenlong
    Feng, Wei
    Du, Ruoyu
    Liang, Shuang
    Chen, Yan
    Wang, Qiong
    Liu, Xuejun
    [J]. IEEE ACCESS, 2019, 7 : 128273 - 128282
  • [35] Dynamic Domain Adaptation for Class-Aware Cross-Subject and Cross-Session EEG Emotion Recognition
    Li, Zhunan
    Zhu, Enwei
    Jin, Ming
    Fan, Cunhang
    He, Huiguang
    Cai, Ting
    Li, Jinpeng
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2022, 26 (12) : 5964 - 5973
  • [36] Data augmentation for cross-subject EEG features using Siamese neural network
    Fu, Rongrong
    Wang, Yaodong
    Jia, Chengcheng
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 75
  • [37] Deep BiLSTM neural network model for emotion detection using cross-dataset approach
    Joshi, Vaishali M.
    Ghongade, Rajesh B.
    Joshi, Aditi M.
    Kulkarni, Rushikesh V.
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2022, 73
  • [38] Cross-Dataset Data Augmentation for Convolutional Neural Networks Training
    Gasparetto, Andrea
    Ressi, Dalila
    Bergamasco, Filippo
    Pistellato, Mara
    Cosmo, Luca
    Boschetti, Marco
    Ursella, Enrico
    Albarelli, Andrea
    [J]. 2018 24TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2018, : 910 - 915
  • [39] Adversarial Discriminative Domain Adaptation and Transformers for EEG-based Cross-Subject Emotion Recognition
    Sartipi, Shadi
    Cetin, Mujdat
    [J]. 2023 11TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING, NER, 2023,
  • [40] Cross-Subject Multimodal Emotion Recognition Based on Hybrid Fusion
    Cimtay, Yucel
    Ekmekcioglu, Erhan
    Caglar-Ozhan, Seyma
    [J]. IEEE ACCESS, 2020, 8 : 168865 - 168878