DCT based multi-head attention-BiGRU model for EEG source location

Cited by: 10
|
Authors
Zhang, Boyuan [1 ]
Li, Donghao [2 ,3 ]
Wang, Dongqing [1 ]
Affiliations
[1] Qingdao Univ, Dept Coll Elect Engn Inst, 308 Ningxia Rd, Qingdao 266071, Peoples R China
[2] Syracuse Univ, Dept Sch Engn, Syracuse, NY 13244 USA
[3] Syracuse Univ, Comp Sci Inst, Syracuse, NY 13244 USA
Funding
National Natural Science Foundation of China;
Keywords
Multi-head attention; Discrete cosine transform; Bidirectional gated recurrent unit; Spatial low frequency components; BOUNDARY-ELEMENT METHOD; SOURCE LOCALIZATION; INVERSE PROBLEM; BRAIN; CLASSIFICATION; STABILITY; NETWORK;
DOI
10.1016/j.bspc.2024.106171
Chinese Library Classification (CLC)
R318 [Biomedical Engineering];
Subject Classification Code
0831;
Abstract
Electroencephalogram source imaging (ESI) refers to localizing brain sources from scalp EEG recordings. Because of the one-to-many relationship between electroencephalogram (EEG) signals and brain sources, ESI is an ill-conditioned inverse problem, leaving substantial room for further research. This article proposes an enhanced multi-head attention (MA) and discrete cosine transform (DCT) based Bidirectional Gated Recurrent Unit (BiGRU) model (MA-DCT-BiGRU for short) for addressing the EEG inverse problem. First, EEG signal characteristics are captured by multi-head attention with the inclusion of average hidden states. Then, the DCT projects the brain source signals into low-, medium-, and high-frequency subspaces spanned by spatial frequency basis vectors; the spatial low-frequency component serves as a filter for extended source reconstruction. Subsequently, a BiGRU learns the mapping from the output of the attention layer to the low-frequency DCT coefficients of the brain source signals. Simulation results establish the superiority of the MA-DCT-BiGRU configuration over other state-of-the-art (SOA) methods for source recovery, irrespective of source mode and signal-to-noise ratio. Experimental results on both synthetic and real epilepsy data demonstrate the effectiveness of the presented framework for epileptogenic area localization.
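The abstract describes a pipeline of multi-head attention over EEG, a spatial DCT projection of the source space, and a BiGRU regressing the low-frequency DCT coefficients. The PyTorch sketch below illustrates one plausible reading of that pipeline; the layer sizes, channel and source counts, DCT truncation (n_low), and the way the average hidden state is injected are illustrative assumptions, not the authors' published configuration.

```python
# Minimal sketch of an MA-DCT-BiGRU-style model, assuming hypothetical
# dimensions (64 channels, 1024 sources, 128 low-frequency coefficients).
import math
import torch
import torch.nn as nn


def dct_basis(n: int) -> torch.Tensor:
    """Orthonormal DCT-II basis (n x n); row k is the k-th spatial-frequency vector."""
    k = torch.arange(n, dtype=torch.float32).unsqueeze(1)
    i = torch.arange(n, dtype=torch.float32).unsqueeze(0)
    basis = torch.cos(math.pi * (i + 0.5) * k / n) * (2.0 / n) ** 0.5
    basis[0] /= 2.0 ** 0.5
    return basis


class MADCTBiGRU(nn.Module):
    def __init__(self, n_channels=64, n_sources=1024, n_low=128,
                 d_model=128, n_heads=4, hidden=256):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.bigru = nn.GRU(d_model, hidden, batch_first=True,
                            bidirectional=True)
        # Regress only the low-frequency DCT coefficients of the sources.
        self.head = nn.Linear(2 * hidden, n_low)
        # Low-frequency rows of the spatial DCT basis, used to map the
        # predicted coefficients back to the full source space.
        self.register_buffer("low_basis", dct_basis(n_sources)[:n_low])

    def forward(self, eeg):                   # eeg: (batch, time, channels)
        x = self.embed(eeg)
        attn_out, _ = self.attn(x, x, x)      # multi-head self-attention
        x = attn_out + x.mean(dim=1, keepdim=True)  # inject average hidden state
        h, _ = self.bigru(x)                  # (batch, time, 2 * hidden)
        coeffs = self.head(h)                 # low-frequency DCT coefficients
        return coeffs @ self.low_basis        # (batch, time, n_sources)


# Usage on random data, just to show tensor shapes.
model = MADCTBiGRU()
eeg = torch.randn(2, 100, 64)                 # 2 trials, 100 samples, 64 channels
print(model(eeg).shape)                       # torch.Size([2, 100, 1024])
```

Restricting the output to the low-frequency DCT coefficients keeps the regression target small and smooth over the cortical surface, which matches the abstract's use of the spatial low-frequency component as a filter for extended sources.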
Pages: 11
Related papers
50 records in total
  • [1] MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention
    Lu, Yiwei
    Yang, Ruopeng
    Jiang, Xuping
    Zhou, Dan
    Yin, Changsheng
    Li, Zizhuo
    [J]. SYMMETRY-BASEL, 2021, 13 (09):
  • [2] Multi-label Text Classification Based on BiGRU and Multi-Head Self-Attention Mechanism
    Luo, Tongtong
    Shi, Nan
    Jin, Meilin
    Qin, Aolong
    Tang, Jiacheng
    Wang, Xihan
    Gao, Quanli
    Shao, Lianhe
    [J]. 2024 3RD INTERNATIONAL CONFERENCE ON IMAGE PROCESSING AND MEDIA COMPUTING, ICIPMC 2024, 2024, : 204 - 210
  • [3] Life Prediction of Wind Turbine Based on Attention-BiGRU
    Lv, Da
    Zhang, Chao
    Fei, Hongbo
    Zhao, Wentao
    Dong, Changhao
    Pang, Yongzhi
    [J]. PROCEEDINGS OF TEPEN 2022, 2023, 129 : 367 - 377
  • [4] A Novel Source Code Representation Approach Based on Multi-Head Attention
    Xiao, Lei
    Zhong, Hao
    Liu, Jianjian
    Zhang, Kaiyu
    Xu, Qizhen
    Chang, Le
    [J]. ELECTRONICS, 2024, 13 (11)
  • [5] Combining Multi-Head Attention and Sparse Multi-Head Attention Networks for Session-Based Recommendation
    Zhao, Zhiwei
    Wang, Xiaoye
    Xiao, Yingyuan
    [J]. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [6] A Novel Knowledge Tracing Model Based on Collaborative Multi-Head Attention
    Zhang Wei
    Qu Kaiyuan
    Han Yahui
    Tan Longan
    [J]. 6TH INTERNATIONAL CONFERENCE ON INNOVATION IN ARTIFICIAL INTELLIGENCE, ICIAI2022, 2022, : 210 - 215
  • [7] Fault location method for distribution networks based on multi-head graph attention networks
    Liang, Lingyu
    Zhang, Huanming
    Cao, Shang
    Zhao, Xiangyu
    Li, Hanju
    Chen, Zhiwei
    [J]. FRONTIERS IN ENERGY RESEARCH, 2024, 12
  • [8] Text classification model based on multi-head attention capsule networks
    Jia X.
    Wang L.
    [J]. Qinghua Daxue Xuebao/Journal of Tsinghua University, 2020, 60 (05): : 415 - 421
  • [9] Machine Reading Comprehension Model Based on Multi-head Attention Mechanism
    Xue, Yong
    [J]. ADVANCED INTELLIGENT TECHNOLOGIES FOR INDUSTRY, 2022, 285 : 45 - 58
  • [10] POI Recommendation Model Using Multi-Head Attention in Location-Based Social Network Big Data
    Liu, Xiaoqiang
    [J]. INTERNATIONAL JOURNAL OF INFORMATION TECHNOLOGIES AND SYSTEMS APPROACH, 2023, 16 (02)