Multi-head Attention-Based Masked Sequence Model for Mapping Functional Brain Networks

Cited: 3
Authors
He, Mengshen [1 ]
Hou, Xiangyu [1 ]
Wang, Zhenwei [1 ]
Kang, Zili [1 ]
Zhang, Xin [2 ]
Qiang, Ning [1 ]
Ge, Bao [1 ]
Affiliations
[1] Shaanxi Normal Univ, Sch Phys & Informat Technol, Xian, Peoples R China
[2] Northwestern Polytech Univ, Xian, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Masked sequence modeling; Multi-head attention; Functional networks; ARCHITECTURE; FMRI;
DOI
10.1007/978-3-031-16431-6_28
CLC classification
TP39 [Computer applications];
Discipline codes
081203; 0835;
Abstract
It has been of great interest in the neuroimaging community to discover functional brain networks (FBNs) based on task functional magnetic resonance imaging (tfMRI). A variety of methods have been used to model tfMRI sequences so far, such as recurrent neural networks (RNNs) and autoencoders. However, these models are not designed to incorporate a key characteristic of tfMRI sequences: the same signal value at different time points in an fMRI time series may represent different states and meanings. Inspired by cloze learning methods and the human ability to judge polysemous words from context, we propose a self-supervised Multi-head Attention-based Masked Sequence Model (MAMSM), analogous to how the BERT model uses Masked Language Modeling (MLM) and multi-head attention to learn the different meanings of the same word in different sentences. MAMSM masks and encodes tfMRI time series, uses multi-head attention to compute the different meanings corresponding to the same signal value in an fMRI sequence, and obtains context information through MSM pre-training. Furthermore, this work defines a new loss function to extract FBNs according to the task design information of the tfMRI data. The model has been applied to the Human Connectome Project (HCP) task fMRI dataset and achieves state-of-the-art performance in modeling brain temporal dynamics: the Pearson correlation coefficient between the learned features and the task design curves was above 0.95, and the model can extract meaningful networks beyond the known task-related brain networks.
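The masking-and-attention idea the abstract describes can be sketched in miniature: hide a few timepoints of a toy fMRI-like sequence, then let multi-head self-attention rebuild a context-dependent representation for every timepoint, so identical signal values can receive different encodings depending on their surroundings. This is a hedged NumPy illustration under assumed shapes and random projection weights; it is not the authors' MAMSM implementation (the encoder, reconstruction loss, and pre-training loop are not reproduced here).

```python
import numpy as np

rng = np.random.default_rng(0)

def multi_head_attention(x, num_heads, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a (T, d) sequence, split into heads."""
    T, d = x.shape
    dh = d // num_heads
    q, k, v = x @ w_q, x @ w_k, x @ w_v            # query/key/value projections, (T, d) each
    out = np.zeros_like(x)
    for h in range(num_heads):
        s = slice(h * dh, (h + 1) * dh)            # this head's feature slice
        scores = q[:, s] @ k[:, s].T / np.sqrt(dh) # (T, T) attention logits
        w = np.exp(scores - scores.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)         # row-wise softmax
        out[:, s] = w @ v[:, s]                    # context-mixed values
    return out

T, d, H = 16, 8, 2                                 # toy sequence length, feature dim, heads
x = rng.normal(size=(T, d))                        # stand-in for encoded tfMRI timepoints
mask = np.zeros(T, dtype=bool)
mask[rng.choice(T, size=3, replace=False)] = True  # mask a few timepoints, as in MLM
x_masked = x.copy()
x_masked[mask] = 0.0                               # replace masked timepoints with a zero token

w_q, w_k, w_v = (rng.normal(scale=d ** -0.5, size=(d, d)) for _ in range(3))
ctx = multi_head_attention(x_masked, H, w_q, w_k, w_v)  # (T, d) contextual encodings
```

In a real masked-sequence model, `ctx` at the masked positions would feed a reconstruction head, and the pre-training loss would compare the predictions against the original hidden values.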
Pages: 295-304
Page count: 10
Related papers
50 in total
  • [41] MedGraph: malicious edge detection in temporal reciprocal graph via multi-head attention-based GNN
    Chen, Kai
    Wang, Ziao
    Liu, Kai
    Zhang, Xiaofeng
    Luo, Linhao
    [J]. NEURAL COMPUTING & APPLICATIONS, 2023, 35 (12): : 8919 - 8935
  • [42] Multi-head Attention Induced Dynamic Hypergraph Convolutional Networks
    Peng, Xu
    Lin, Wei
    Jin, Taisong
    [J]. PATTERN RECOGNITION AND COMPUTER VISION, PRCV 2023, PT IX, 2024, 14433 : 256 - 268
  • [43] DeepCKID: A Multi-Head Attention-Based Deep Neural Network Model Leveraging Classwise Knowledge to Handle Imbalanced Textual Data
    Sah, Amit Kumar
    Abulaish, Muhammad
    [J]. MACHINE LEARNING WITH APPLICATIONS, 2024, 17
  • [45] Multi-Head Attention-Based Long Short-Term Memory for Depression Detection From Speech
    Zhao, Yan
    Liang, Zhenlin
    Du, Jing
    Zhang, Li
    Liu, Chengyu
    Zhao, Li
    [J]. FRONTIERS IN NEUROROBOTICS, 2021, 15
  • [46] MoMA: Momentum contrastive learning with multi-head attention-based knowledge distillation for histopathology image analysis
    Le Vuong, Trinh Thi
    Kwak, Jin Tae
[J]. MEDICAL IMAGE ANALYSIS, 2025, 101
  • [47] A Network Intrusion Detection Model Based on BiLSTM with Multi-Head Attention Mechanism
    Zhang, Jingqi
    Zhang, Xin
    Liu, Zhaojun
    Fu, Fa
    Jiao, Yihan
    Xu, Fei
    [J]. ELECTRONICS, 2023, 12 (19)
  • [48] MRE: A Military Relation Extraction Model Based on BiGRU and Multi-Head Attention
    Lu, Yiwei
    Yang, Ruopeng
    Jiang, Xuping
    Zhou, Dan
    Yin, Changsheng
    Li, Zizhuo
    [J]. SYMMETRY-BASEL, 2021, 13 (09):
  • [49] Multi-head attention model for aspect level sentiment analysis
    Zhang, Xinsheng
    Gao, Teng
    [J]. JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2020, 38 (01) : 89 - 96
  • [50] Hierarchical Gated Convolutional Networks with Multi-Head Attention for Text Classification
    Du, Haizhou
    Qian, Jingu
    [J]. 2018 5TH INTERNATIONAL CONFERENCE ON SYSTEMS AND INFORMATICS (ICSAI), 2018, : 1170 - 1175