Research on EEG emotion recognition based on CNN+BiLSTM+self-attention model

Cited by: 0
Authors
LI Xueqing [1 ]
LI Penghai [1 ]
FANG Zhendong [2 ]
CHENG Longlong [3 ]
WANG Zhiyong [1 ]
WANG Weijie [4 ]
Affiliations
[1] School of Integrated Circuit Science and Engineering, Tianjin University of Technology
[2] China France BOHAI Geoservices Co., Ltd.
[3] China Electronics Cloud Brain (Tianjin) Technology Co., Ltd.
[4] Institute of Motion Analysis and Research, University of Dundee
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TN911.7 [Signal Processing]; TP183 [Artificial Neural Networks and Computation]; R318 [Biomedical Engineering];
Discipline Classification Codes
0711 ; 080401 ; 080402 ; 081104 ; 0812 ; 0831 ; 0835 ; 1405 ;
Abstract
To address the problems of insufficient dimensionality in electroencephalogram (EEG) feature extraction, the tendency to ignore the differing importance of sequential data segments, and the poor generalization ability of models in EEG-based emotion recognition, a model combining a convolutional neural network, bi-directional long short-term memory, and self-attention (CNN+BiLSTM+self-attention) is proposed. The convolutional neural network (CNN) extracts more discriminative features in both the spatial and temporal dimensions. The bi-directional long short-term memory (BiLSTM) network further preserves the long-term dependencies between the temporal phases of the sequential data. The self-attention mechanism adjusts the weights of different channels to extract and highlight important information, addressing the often-ignored differences in importance among channels and samples during EEG feature extraction. Subject-dependent and subject-independent experiments are performed on the Database for Emotion Analysis using Physiological Signals (DEAP) and collected datasets to verify recognition performance. The experimental results show that the proposed model achieves excellent recognition performance and generalization ability.
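To make the described pipeline concrete, the following is a minimal PyTorch sketch of a CNN + BiLSTM + self-attention classifier for segmented EEG. It assumes input shaped (batch, segments, electrodes, samples); the layer sizes, the 32-electrode/128-sample segment, the 4-head attention, and the two-class output head are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of a CNN + BiLSTM + self-attention EEG classifier (assumed sizes).
import torch
import torch.nn as nn


class CnnBiLstmSelfAttention(nn.Module):
    def __init__(self, eeg_channels=32, cnn_features=64, lstm_hidden=64, num_classes=2):
        super().__init__()
        # CNN: spatial (across electrodes) then temporal (within a segment) convolutions,
        # applied to each segment independently.
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=(eeg_channels, 1)),   # spatial convolution
            nn.BatchNorm2d(32),
            nn.ELU(),
            nn.Conv2d(32, cnn_features, kernel_size=(1, 16)),  # temporal convolution
            nn.BatchNorm2d(cnn_features),
            nn.ELU(),
            nn.AdaptiveAvgPool2d((1, 1)),                      # one feature vector per segment
        )
        # BiLSTM: long-term dependencies across the sequence of segments.
        self.bilstm = nn.LSTM(cnn_features, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Self-attention: re-weights segments so informative ones dominate.
        self.attn = nn.MultiheadAttention(embed_dim=2 * lstm_hidden,
                                          num_heads=4, batch_first=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, x):                          # x: (batch, segments, electrodes, samples)
        b, s, c, t = x.shape
        x = x.reshape(b * s, 1, c, t)              # fold segments into the batch dimension
        feats = self.cnn(x).reshape(b, s, -1)      # (batch, segments, cnn_features)
        seq, _ = self.bilstm(feats)                # (batch, segments, 2 * lstm_hidden)
        attended, _ = self.attn(seq, seq, seq)     # segment-wise self-attention
        return self.classifier(attended.mean(dim=1))


# Example: 4 trials, each split into 10 segments of 32 electrodes x 128 samples.
logits = CnnBiLstmSelfAttention()(torch.randn(4, 10, 32, 128))
print(logits.shape)  # torch.Size([4, 2])
```

The same skeleton works for valence/arousal classification on DEAP-style data by adjusting the electrode count, segment length, and number of classes.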
Pages: 506-512
Page count: 7
Related papers
50 records in total
  • [31] Customized 2D CNN Model for the Automatic Emotion Recognition Based on EEG Signals
    Baradaran, Farzad
    Farzan, Ali
    Danishvar, Sebelan
    Sheykhivand, Sobhan
    ELECTRONICS, 2023, 12 (10)
  • [32] Hybrid LSTM–Attention and CNN Model for Enhanced Speech Emotion Recognition
    Makhmudov, Fazliddin
    Kutlimuratov, Alpamis
    Cho, Young-Im
    APPLIED SCIENCES (SWITZERLAND), 2024, 14 (23)
  • [33] A Bi-Stream hybrid model with MLPBlocks and self-attention mechanism for EEG-based emotion recognition
    Li, Wei
    Tian, Ye
    Hou, Bowen
    Dong, Jianzhang
    Shao, Shitong
    Song, Aiguo
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 86
  • [34] EEG Emotion Recognition Network Based on Attention and Spatiotemporal Convolution
    Zhu, Xiaoliang
    Liu, Chen
    Zhao, Liang
    Wang, Shengming
    SENSORS, 2024, 24 (11)
  • [35] Enhancing Emotion Recognition in Text with Stacked CNN-BiLSTM Framework
    Banu, N. Nasrin
    Senthilkumar, Radha
    Mohesh, B.
    Giridhar, N. Sabari
    Shanmugasundaram, G.
    2024 INTERNATIONAL CONFERENCE ON ADVANCES IN COMPUTING, COMMUNICATION AND APPLIED INFORMATICS, ACCAI 2024, 2024
  • [36] A Multi-Column CNN Model for Emotion Recognition from EEG Signals
    Yang, Heekyung
    Han, Jongdae
    Min, Kyungha
    SENSORS, 2019, 19 (21)
  • [37] Classification of EEG Signal by WT-CNN Model In Emotion Recognition System
    Zhang, Benyu
    Jiang, Huiping
    Dong, Linshan
    2017 IEEE 16TH INTERNATIONAL CONFERENCE ON COGNITIVE INFORMATICS & COGNITIVE COMPUTING (ICCI*CC), 2017, : 109 - 114
  • [38] Research on Named Entity Recognition Method of Metro On-Board Equipment Based on Multiheaded Self-Attention Mechanism and CNN-BiLSTM-CRF
    Lin, Junting
    Liu, Endong
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [39] A novel transferability attention neural network model for EEG emotion recognition
    Li, Yang
    Fu, Boxun
    Li, Fu
    Shi, Guangming
    Zheng, Wenming
    NEUROCOMPUTING, 2021, 447 : 92 - 101
  • [40] Network Intrusion Detection Method Based on CNN-BiLSTM-Attention Model
    Dai, Wei
    Li, Xinhui
    Ji, Wenxin
    He, Sicheng
    IEEE ACCESS, 2024, 12 : 53099 - 53111