Multi-head Attention-Based Masked Sequence Model for Mapping Functional Brain Networks

Cited: 3
Authors
He, Mengshen [1 ]
Hou, Xiangyu [1 ]
Wang, Zhenwei [1 ]
Kang, Zili [1 ]
Zhang, Xin [2 ]
Qiang, Ning [1 ]
Ge, Bao [1 ]
Affiliations
[1] Shaanxi Normal Univ, Sch Phys & Informat Technol, Xian, Peoples R China
[2] Northwestern Polytech Univ, Xian, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Masked sequence modeling; Multi-head attention; Functional networks; ARCHITECTURE; FMRI;
DOI
10.1007/978-3-031-16431-6_28
Chinese Library Classification
TP39 [Computer Applications]
Discipline Code
081203; 0835
Abstract
Discovering functional brain networks (FBNs) from task functional magnetic resonance imaging (tfMRI) has been of great interest in the neuroimaging community. A variety of models have been applied to tfMRI sequences, such as recurrent neural networks (RNNs) and autoencoders. However, these models are not designed around a key characteristic of tfMRI sequences: the same signal value at different time points in an fMRI time series may represent different states and meanings. Inspired by cloze learning and the human ability to judge polysemous words from context, we propose a self-supervised Multi-head Attention-based Masked Sequence Model (MAMSM), analogous to how the BERT model uses Masked Language Modeling (MLM) and multi-head attention to learn the different meanings of the same word in different sentences. MAMSM masks and encodes tfMRI time series, uses multi-head attention to compute the different meanings corresponding to the same signal value in an fMRI sequence, and obtains contextual information through MSM pretraining. Furthermore, this work defines a new loss function to extract FBNs according to the task design information of the tfMRI data. Applied to the Human Connectome Project (HCP) task fMRI dataset, the model achieves state-of-the-art performance in capturing brain temporal dynamics: the Pearson correlation coefficient between learned features and task design curves exceeded 0.95, and the model extracts additional meaningful networks beyond the known task-related brain networks.
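The two mechanisms the abstract combines, MLM-style masking of a time series followed by multi-head self-attention over the masked sequence, can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the sequence length, embedding size, mask ratio, and random projection weights are all illustrative stand-ins for learned parameters.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mask_sequence(x, mask_ratio, rng, mask_value=0.0):
    # Randomly mask a fraction of time points, as in MLM-style pretraining;
    # the model is later trained to reconstruct the masked entries.
    T = x.shape[0]
    idx = rng.choice(T, size=int(T * mask_ratio), replace=False)
    x_masked = x.copy()
    x_masked[idx] = mask_value
    return x_masked, idx

def multi_head_attention(x, n_heads, rng):
    # x: (T, d) sequence of T time points with d-dim embeddings.
    # Each head attends over the full sequence, so the same signal value
    # can receive a different representation depending on its context.
    T, d = x.shape
    assert d % n_heads == 0
    dh = d // n_heads
    # Random projections stand in for learned query/key/value weights.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    split = lambda m: m.reshape(T, n_heads, dh).transpose(1, 0, 2)
    qh, kh, vh = split(x @ Wq), split(x @ Wk), split(x @ Wv)
    attn = softmax(qh @ kh.transpose(0, 2, 1) / np.sqrt(dh))  # (n_heads, T, T)
    out = (attn @ vh).transpose(1, 0, 2).reshape(T, d)        # concat heads
    return out, attn

rng = np.random.default_rng(0)
x = rng.standard_normal((100, 32))            # toy tfMRI-like sequence
x_masked, masked_idx = mask_sequence(x, 0.15, rng)
out, attn = multi_head_attention(x_masked, n_heads=4, rng=rng)
```

In a full pretraining loop, a reconstruction loss on the masked positions (and, per the abstract, an additional term tied to the task design curves) would drive the learned projections; here the point is only the masking-then-attention data flow.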
Pages: 295-304 (10 pages)