Electroencephalogram-based emotion recognition using factorization temporal separable convolution network

Cited by: 2
Authors
Yang, Lijun [1 ,2 ]
Wang, Yixin [1 ]
Ouyang, Rujie [1 ]
Niu, Xiaolong [1 ]
Yang, Xiaohui [1 ,2 ]
Zheng, Chen [1 ,2 ]
Affiliations
[1] Henan Univ, Henan Engn Res Ctr Artificial Intelligence Theory, Sch Math & Stat, Kaifeng 475004, Peoples R China
[2] Henan Univ, Ctr Appl Math Henan Prov, Zhengzhou 450046, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Electroencephalogram; Emotion recognition; Temporal convolution network; Factorization machine;
DOI
10.1016/j.engappai.2024.108011
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline Classification Code
0812;
Abstract
Temporal Convolutional Networks (TCNs) expand their receptive field through dilated convolutions, which is essential for capturing dependencies in longer sequences. This characteristic is especially critical for detecting long-term patterns in Electroencephalogram (EEG) data, making TCNs a suitable choice for EEG-based emotion recognition. In this study, to effectively capture interactions between features, we incorporate Factorization Machines (FM) into the TCN model and propose the Factorization Temporal Convolution Network (FTCN) for EEG-based emotion recognition. On one hand, the FTCN enhances the understanding of temporal dynamics by capturing long-term dependencies in time-series data through the TCN. On the other hand, it uses the FM to increase the model's expressive power in the feature dimension, allowing a more comprehensive understanding of EEG data. Building on this, separable convolutions are incorporated into the FTCN to develop the Factorization Temporal Separable Convolution Network (FTSCN). This approach reduces the model's parameter count by splitting standard convolutions into two simpler operations, thus accelerating training and inference. Experiments on the DEAP and SEED datasets demonstrate the effectiveness of both models, which achieve recognition accuracy competitive with published methods. Specifically, on the DEAP dataset the recognition accuracies for arousal and valence reach 97.39% ± 1.93 and 97.55% ± 1.65, and the accuracy on the four-class task reaches 95.43% ± 2.43; on the SEED dataset the recognition accuracy reaches 89.13% ± 4.49.
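To make the architecture described in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch of an FTSCN-style model: a stack of dilated, causal, depthwise-separable temporal convolution blocks (the TCN part) followed by a factorization-machine second-order interaction term over the pooled features. All layer sizes, the pooling strategy, the class names (SeparableTemporalBlock, FMInteraction, FTSCNSketch), and the way the FM output is concatenated with the pooled features are assumptions made for illustration; this is not the authors' implementation.

```python
# Hypothetical sketch of an FTSCN-style model; hyperparameters and wiring are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SeparableTemporalBlock(nn.Module):
    """One TCN-style residual block built from a dilated, causal, depthwise-separable 1-D convolution."""

    def __init__(self, channels: int, kernel_size: int = 3, dilation: int = 1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # causal left-padding length
        # Depthwise convolution (one filter per channel) followed by a 1x1 pointwise
        # convolution; together they replace a standard convolution with fewer parameters.
        self.depthwise = nn.Conv1d(channels, channels, kernel_size,
                                   dilation=dilation, groups=channels)
        self.pointwise = nn.Conv1d(channels, channels, 1)
        self.norm = nn.BatchNorm1d(channels)

    def forward(self, x):                       # x: (batch, channels, time)
        out = F.pad(x, (self.pad, 0))           # pad only on the left: no future leakage
        out = self.pointwise(self.depthwise(out))
        out = F.relu(self.norm(out))
        return out + x                          # residual connection


class FMInteraction(nn.Module):
    """Factorization-machine second-order term over a feature vector."""

    def __init__(self, num_features: int, embed_dim: int = 8):
        super().__init__()
        self.v = nn.Parameter(torch.randn(num_features, embed_dim) * 0.01)

    def forward(self, x):                       # x: (batch, num_features)
        # Per factor dimension k: 0.5 * [ (sum_i v_ik x_i)^2 - sum_i (v_ik x_i)^2 ],
        # kept as a vector (not summed over k) so it can feed the classifier.
        sum_sq = (x @ self.v) ** 2              # (batch, embed_dim)
        sq_sum = (x ** 2) @ (self.v ** 2)       # (batch, embed_dim)
        return 0.5 * (sum_sq - sq_sum)


class FTSCNSketch(nn.Module):
    """Dilated separable temporal blocks + FM interaction head for EEG emotion classification."""

    def __init__(self, eeg_channels: int = 32, hidden: int = 64,
                 num_blocks: int = 4, num_classes: int = 2):
        super().__init__()
        self.input_proj = nn.Conv1d(eeg_channels, hidden, 1)
        self.blocks = nn.Sequential(*[
            SeparableTemporalBlock(hidden, dilation=2 ** i) for i in range(num_blocks)
        ])
        self.fm = FMInteraction(hidden, embed_dim=8)
        self.classifier = nn.Linear(hidden + 8, num_classes)

    def forward(self, x):                       # x: (batch, eeg_channels, time)
        h = self.blocks(self.input_proj(x))     # (batch, hidden, time)
        pooled = h.mean(dim=-1)                 # temporal average pooling
        fm_out = self.fm(pooled)                # pairwise interactions of pooled features
        return self.classifier(torch.cat([pooled, fm_out], dim=-1))


if __name__ == "__main__":
    model = FTSCNSketch(eeg_channels=32, num_classes=2)
    dummy = torch.randn(4, 32, 384)             # e.g. 3 s of 128 Hz, 32-channel EEG
    print(model(dummy).shape)                    # torch.Size([4, 2])
```

The FM term captures all pairwise feature interactions in time linear in the number of features, while the depthwise-plus-pointwise split is what reduces the parameter count relative to a standard convolution, in line with the efficiency claims of the abstract.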
Pages: 13
Related Papers (50 in total)
  • [21] Multi-Source Domain Transfer Discriminative Dictionary Learning Modeling for Electroencephalogram-Based Emotion Recognition
    Gu, Xiaoqing
    Cai, Weiwei
    Gao, Ming
    Jiang, Yizhang
    Ning, Xin
    Qian, Pengjiang
    [J]. IEEE TRANSACTIONS ON COMPUTATIONAL SOCIAL SYSTEMS, 2022, 9 (06) : 1604 - 1612
  • [22] Hybrid Network Using Dynamic Graph Convolution and Temporal Self-Attention for EEG-Based Emotion Recognition
    Cheng, Cheng
    Yu, Zikang
    Zhang, Yong
    Feng, Lin
[J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023: 1 - 11
  • [23] Recognition method for adhesive fish based on depthwise separable convolution network
    Zhang, Lu
    Li, Daoliang
    Cao, Xinkai
    Li, Wensheng
    Tian, Ganglu
    Duan, Qingling
[J]. Nongye Gongcheng Xuebao/Transactions of the Chinese Society of Agricultural Engineering, 2021, 37 (17): 160 - 167
  • [24] Electroencephalogram based face emotion recognition using multimodal fusion and 1-D convolution neural network (ID-CNN) classifier
    Alotaibi, Youseef
Vuyyuru, Veera Ankalu
[J]. AIMS MATHEMATICS, 2023, 8 (10): 22984 - 23002
  • [25] EEG Emotion Recognition Network Based on Attention and Spatiotemporal Convolution
    Zhu, Xiaoliang
    Liu, Chen
    Zhao, Liang
    Wang, Shengming
    [J]. SENSORS, 2024, 24 (11)
  • [26] Challenges and Future Perspectives on Electroencephalogram-Based Biometrics in Person Recognition
    Chan, Hui-Ling
    Kuo, Po-Chih
    Cheng, Chia-Yi
    Chen, Yong-Sheng
    [J]. FRONTIERS IN NEUROINFORMATICS, 2018, 12
  • [27] A Convolution Neural Network Based Emotion Recognition System using Multimodal Physiological Signals
    Yang, Cheng-Jie
    Fahier, Nicolas
    Li, Wei-Chih
    Fang, Wai-Chi
    [J]. 2020 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS - TAIWAN (ICCE-TAIWAN), 2020,
  • [28] The technique of emotion recognition based on electroencephalogram
    Osaka, Kyoko
    Tsuchiya, Seiji
    Ren, Fuji
    Kuroiwa, Shingo
    Tanioka, Tetsuya
    Locsin, Rozzano C.
[J]. INFORMATION-AN INTERNATIONAL INTERDISCIPLINARY JOURNAL, 2008, 11 (01): 55 - 68
  • [29] Emotion recognition based on photoplethysmogram and electroencephalogram
    Tong, Zhongkai
    Chen, XianXiang
    He, Zhengling
    Tong, Kai
    Fang, Zhen
    Wang, Xianlong
    [J]. 2018 IEEE 42ND ANNUAL COMPUTER SOFTWARE AND APPLICATIONS CONFERENCE (COMPSAC 2018), VOL 2, 2018, : 402 - 407
  • [30] Emotion recognition based on microstate analysis from temporal and spatial patterns of electroencephalogram
    Wei, Zhen
    Li, Hongwei
    Ma, Lin
    Li, Haifeng
    [J]. FRONTIERS IN NEUROSCIENCE, 2024, 18