Multi-head self-attention mechanism-based global feature learning model for ASD diagnosis

Citations: 2
Authors
Zhao, Feng [1 ]
Feng, Fan [1 ]
Ye, Shixin [1 ]
Mao, Yanyan [1 ]
Chen, Xiaobo [1 ]
Li, Yuan [2 ]
Ning, Mao [3 ]
Zhang, Mingli [1 ]
Affiliations
[1] Shandong Technol & Business Univ, Sch Comp Sci & Technol, Yantai, Peoples R China
[2] Shandong Technol & Business Univ, Sch Management Sci & Engn, Yantai, Peoples R China
[3] Yantai Yuhuangding Hosp, Dept Radiol, Yantai, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Autism Spectrum Disorder; rs-fMRI; Self-attention; Sliding window; DISORDER; CHILDREN;
DOI
10.1016/j.bspc.2024.106090
Chinese Library Classification
R318 [Biomedical Engineering];
Discipline Code
0831;
Abstract
Background and Objectives: Static functional connectivity (SFC) networks derived from resting-state functional MRI (rs-fMRI) typically capture local correlations between specific brain regions while neglecting broader connections across the entire brain. This limitation can hinder the accurate diagnosis of neurological conditions such as Autism Spectrum Disorder (ASD). This study aimed to overcome this limitation and improve the accuracy of ASD identification.
Methods: We propose a self-attention-based ASD classification model. Using sliding windows with a longer window width, we locally sample the original data to increase the training sample size, thereby alleviating model overfitting. We then introduce a multi-head self-attention mechanism, forming a deep model composed of stacked attention blocks. This ensures the capture of not only local correlations but also overall brain network features, significantly enhancing ASD classification accuracy.
Results: The proposed model was evaluated on fMRI data from the ABIDE NYU site. Experimental results demonstrated an accuracy of 81.47%, a sensitivity of 83.8%, and a specificity of 80.16%. Compared to other methods in the literature, our approach achieved superior accuracy. Furthermore, the experiments revealed that the biomarkers the model uses for classification are primarily distributed across brain regions such as the superior frontal gyrus, middle frontal gyrus, and hippocampus, consistent with previous research findings.
Conclusion: The sliding-window method effectively enriches the dataset and alleviates overfitting. At the same time, the proposed self-attention-based model can effectively extract global information from brain regions, providing a viable way to improve the accuracy of ASD identification.
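The abstract's two core ideas, sliding-window resampling to enlarge the training set and multi-head self-attention over whole-brain features, can be sketched in a few lines. The following is a minimal NumPy illustration only: the window width, stride, parcellation size (116 ROIs, as in an AAL-style atlas), and head count are assumptions for demonstration, not the paper's reported hyperparameters.

```python
import numpy as np

def sliding_windows(ts, width, stride):
    """Cut an rs-fMRI time series (T timepoints x R ROIs) into
    overlapping windows, multiplying the number of training samples."""
    T = ts.shape[0]
    starts = range(0, T - width + 1, stride)
    return np.stack([ts[s:s + width] for s in starts])

def multi_head_self_attention(X, Wq, Wk, Wv, n_heads):
    """One multi-head self-attention block over a token matrix X (n x d).
    Treating each ROI's connectivity profile as a token lets every ROI
    attend to every other ROI, capturing whole-brain (global) structure."""
    n, d = X.shape
    dh = d // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(n_heads):
        sl = slice(h * dh, (h + 1) * dh)
        scores = Q[:, sl] @ K[:, sl].T / np.sqrt(dh)  # scaled dot-product
        scores -= scores.max(axis=-1, keepdims=True)  # numerical stability
        attn = np.exp(scores)
        attn /= attn.sum(axis=-1, keepdims=True)      # row-wise softmax
        heads.append(attn @ V[:, sl])
    return np.concatenate(heads, axis=-1)             # (n, d)

# Illustrative run (all sizes are assumptions, not the paper's settings):
rng = np.random.default_rng(0)
ts = rng.standard_normal((176, 116))         # 176 timepoints, 116 ROIs
wins = sliding_windows(ts, width=120, stride=8)
fc = np.corrcoef(wins[0].T)                  # per-window functional connectivity
d = fc.shape[0]
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
out = multi_head_self_attention(fc, Wq, Wk, Wv, n_heads=4)
```

In a full model, such blocks would be stacked and followed by a classification head; here the sketch only shows how windowing expands one scan into several samples and how each ROI token aggregates information from all others.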
Pages: 9
Related Articles
50 records
  • [1] Multi-head Self-attention Recommendation Model based on Feature Interaction Enhancement
    Yin, Yunfei
    Huang, Caihao
    Sun, Jingqin
    Huang, Faliang
    [J]. IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS (ICC 2022), 2022, : 1740 - 1745
  • [2] Epilepsy detection based on multi-head self-attention mechanism
    Ru, Yandong
    An, Gaoyang
    Wei, Zheng
    Chen, Hongming
    [J]. PLOS ONE, 2024, 19 (06):
  • [3] Deep Bug Triage Model Based on Multi-head Self-attention Mechanism
    Yu, Xu
    Wan, Fayang
    Tang, Bin
    Zhan, Dingjia
    Peng, Qinglong
    Yu, Miao
    Wang, Zhaozhe
    Cui, Shuang
    [J]. COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2021, PT II, 2022, 1492 : 107 - 119
  • [4] Arrhythmia classification algorithm based on multi-head self-attention mechanism
    Wang, Yue
    Yang, Guanci
    Li, Shaobo
    Li, Yang
    He, Ling
    Liu, Dan
    [J]. BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2023, 79
  • [5] Speech enhancement method based on the multi-head self-attention mechanism
    Chang X.
    Zhang Y.
    Yang L.
    Kou J.
    Wang X.
    Xu D.
    [J]. Xi'an Dianzi Keji Daxue Xuebao/Journal of Xidian University, 2020, 47 (01): : 104 - 110
  • [6] A Bearing Fault Diagnosis Method Based on Dilated Convolution and Multi-Head Self-Attention Mechanism
    Hou, Peng
    Zhang, Jianjie
    Jiang, Zhangzheng
    Tang, Yiyu
    Lin, Ying
    [J]. APPLIED SCIENCES-BASEL, 2023, 13 (23):
  • [7] Text summarization based on multi-head self-attention mechanism and pointer network
    Qiu, Dong
    Yang, Bing
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2022, 8 (01) : 555 - 567
  • [9] Adaptive Pruning for Multi-Head Self-Attention
    Messaoud, Walid
    Trabelsi, Rim
    Cabani, Adnane
    Abdelkefi, Fatma
    [J]. ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING, ICAISC 2023, PT II, 2023, 14126 : 48 - 57
  • [10] Attention as Relation: Learning Supervised Multi-head Self-Attention for Relation Extraction
    Liu, Jie
    Chen, Shaowei
    Wang, Bingquan
    Zhang, Jiaxin
    Li, Na
    Xu, Tong
    [J]. PROCEEDINGS OF THE TWENTY-NINTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, : 3787 - 3793