Investigating the Use of Pretrained Convolutional Neural Network on Cross-Subject and Cross-Dataset EEG Emotion Recognition

Cited by: 142
Authors
Cimtay, Yucel [1 ]
Ekmekcioglu, Erhan [1 ]
Affiliations
[1] Loughborough Univ London, Inst Digital Technol, London E20 3BS, England
Keywords
EEG; emotion recognition; pretrained models; convolutional neural network; dense layer; subject independency; dataset independency; raw data; filtering on output; FEATURES; KERNEL;
DOI
10.3390/s20072034
CLC number
O65 [Analytical Chemistry];
Discipline codes
070302; 081704;
Abstract
The electroencephalogram (EEG) is highly attractive for emotion recognition studies due to its resistance to deceptive human actions; this is one of the most significant advantages of brain signals over visual or speech signals in the emotion recognition context. A major challenge in EEG-based emotion recognition is that EEG recordings exhibit varying distributions across different people, as well as for the same person at different times. This nonstationary nature of EEG limits its accuracy when subject independence is the priority. The aim of this study is to increase subject-independent recognition accuracy by exploiting pretrained state-of-the-art Convolutional Neural Network (CNN) architectures. Unlike similar studies that extract spectral band-power features from the EEG readings, raw EEG data is used in our study after windowing, pre-adjustments, and normalization. Removing manual feature extraction from the training pipeline avoids the risk of discarding hidden features in the raw data and helps leverage the deep neural network's power to uncover unknown features. To improve classification accuracy further, a median filter is used to eliminate false detections along a prediction interval of emotions. This method yields a mean cross-subject accuracy of 86.56% and 78.34% on the Shanghai Jiao Tong University Emotion EEG Dataset (SEED) for two and three emotion classes, respectively. It also yields a mean cross-subject accuracy of 72.81% on the Database for Emotion Analysis using Physiological Signals (DEAP) and 81.8% on the Loughborough University Multimodal Emotion Dataset (LUMED) for two emotion classes. Furthermore, the recognition model trained on the SEED dataset was tested on the DEAP dataset, yielding a mean prediction accuracy of 58.1% across all subjects and emotion classes.
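The preprocessing the abstract describes (segmenting raw multichannel EEG into fixed-length windows and normalizing each window before it is fed to the CNN, rather than computing band-power features) can be sketched as follows. This is an illustrative assumption, not the authors' exact code; the per-window z-scoring, window length, and step size are hypothetical choices.

```python
# Hedged sketch of the raw-EEG preprocessing described in the abstract:
# windowing plus normalization, with no hand-crafted feature extraction.
# Per-window z-scoring and the window/step parameters are assumptions.

def window_and_normalize(signal, win_len, step):
    """signal: list of per-channel sample lists (channels x samples).
    Returns a list of z-scored windows, each channels x win_len."""
    n = len(signal[0])
    windows = []
    for start in range(0, n - win_len + 1, step):
        seg = [ch[start:start + win_len] for ch in signal]
        flat = [s for ch in seg for s in ch]
        mu = sum(flat) / len(flat)
        var = sum((s - mu) ** 2 for s in flat) / len(flat)
        sd = var ** 0.5 or 1.0  # guard against a flat (zero-variance) segment
        windows.append([[(s - mu) / sd for s in ch] for ch in seg])
    return windows

# Two channels, ten samples, 4-sample windows with 50% overlap:
demo = window_and_normalize(
    [[float(i) for i in range(10)], [float(-i) for i in range(10)]],
    win_len=4, step=2)
print(len(demo))  # number of windows produced
```

Overlapping windows also act as data augmentation, which matters when fine-tuning a large pretrained CNN on comparatively small EEG datasets.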
Results show that, in terms of classification accuracy, the proposed approach is superior to, or on par with, the reference subject-independent EEG emotion recognition studies identified in the literature, and it has limited complexity because the need for feature extraction is eliminated.
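The median-filter post-processing mentioned in the abstract (smoothing the sequence of per-window emotion predictions so that isolated false detections are suppressed) can be illustrated with a small sketch. The filter length and integer label encoding here are assumptions, not values taken from the paper.

```python
# Hedged sketch (not the authors' code): a sliding median filter over the
# sequence of predicted emotion labels removes short spurious flips,
# exploiting the fact that emotions persist over a prediction interval.
from statistics import median

def median_filter_labels(preds, k=5):
    """Smooth a sequence of integer class predictions with an odd-length
    median filter; edges use a truncated window."""
    assert k % 2 == 1, "filter length must be odd"
    half = k // 2
    smoothed = []
    for i in range(len(preds)):
        lo = max(0, i - half)
        hi = min(len(preds), i + half + 1)
        smoothed.append(int(median(preds[lo:hi])))
    return smoothed

# A lone misclassified window (the 0) is corrected by its neighbours:
print(median_filter_labels([1, 1, 0, 1, 1, 1, 1], k=5))  # → [1, 1, 1, 1, 1, 1, 1]
```

A median (rather than mean) filter is the natural choice for categorical labels, since it always returns a value that occurred in the window and never interpolates between classes.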
Pages: 20
Related Papers
50 items in total
  • [21] Cross-Subject Emotion Recognition Based on Domain Similarity of EEG Signal Transfer
    Ma, Yuliang
    Zhao, Weicheng
    Meng, Ming
    Zhang, Qizhong
    She, Qingshan
    Zhang, Jianhai
    [J]. IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, 2023, 31 : 936 - 943
  • [22] Cross-Subject EEG-Based Emotion Recognition with Deep Domain Confusion
    Zhang, Weiwei
    Wang, Fei
    Jiang, Yang
    Xu, Zongfeng
    Wu, Shichao
    Zhang, Yahui
    [J]. INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2019, PT I, 2019, 11740 : 558 - 570
  • [23] Comprehensive Multisource Learning Network for Cross-Subject Multimodal Emotion Recognition
    Chen, Chuangquan
    Li, Zhencheng
    Kou, Kit Ian
    Du, Jie
    Li, Chen
    Wang, Hongtao
    Vong, Chi-Man
    [J]. IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024,
  • [24] INVESTIGATING CROSS-DATASET ABNORMALITY DETECTION IN ENDOSCOPY WITH A WEAKLY-SUPERVISED MULTISCALE CONVOLUTIONAL NEURAL NETWORK
    Diamantis, Dimitris
    Iakovidis, Dimitris K.
    Koulaouzidis, Anastasios
    [J]. 2018 25TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2018, : 3124 - 3128
  • [25] Personality first in emotion: a deep neural network based on electroencephalogram channel attention for cross-subject emotion recognition
    Tian, Zhihang
    Huang, Dongmin
    Zhou, Sijin
    Zhao, Zhidan
    Jiang, Dazhi
    [J]. ROYAL SOCIETY OPEN SCIENCE, 2021, 8 (08):
  • [26] Domain Adaptation for Cross-Subject Emotion Recognition by Subject Clustering
    Liu, Jin
    Shen, Xinke
    Song, Sen
    Zhang, Dan
    [J]. 2021 10TH INTERNATIONAL IEEE/EMBS CONFERENCE ON NEURAL ENGINEERING (NER), 2021, : 904 - 908
  • [27] Cross-subject EEG linear domain adaption based on batch normalization and depthwise convolutional neural network
    Li, Guofa
    Ouyang, Delin
    Yang, Liu
    Li, Qingkun
    Tian, Kai
    Wu, Baiheng
    Guo, Gang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 280
  • [28] A Novel Experiment Setting for Cross-subject Emotion Recognition
    Hu, Hao-Yi
    Zhao, Li-Ming
    Liu, Yu-Zhong
    Li, Hua-Liang
    Lu, Bao-Liang
    [J]. 2021 43RD ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE & BIOLOGY SOCIETY (EMBC), 2021, : 6416 - 6419
  • [30] JOINT TEMPORAL CONVOLUTIONAL NETWORKS AND ADVERSARIAL DISCRIMINATIVE DOMAIN ADAPTATION FOR EEG-BASED CROSS-SUBJECT EMOTION RECOGNITION
    He, Zhipeng
    Zhong, Yongshi
    Pan, Jiahui
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 3214 - 3218