Exploring Implicit Biological Heterogeneity in ASD Diagnosis Using a Multi-Head Attention Graph Neural Network

Cited by: 0
Authors
Moon, Hyung-Jun [1 ]
Cho, Sung-Bae [2 ]
Affiliations
[1] Yonsei Univ, Dept Artificial Intelligence, Seoul 03722, South Korea
[2] Yonsei Univ, Dept Comp Sci, Seoul 03722, South Korea
Keywords
autism spectrum disorder; dynamic functional connectivity; graph neural network; multi-head attention; AUTISM; AMYGDALA; CHILDREN; CLASSIFICATION; PREDICTION; MICROGLIA;
DOI
10.31083/j.jin2307135
CLC Classification
Q189 [Neuroscience];
Discipline Code
071006 ;
Abstract
Background: Autism spectrum disorder (ASD) is a neurodevelopmental disorder with heterogeneous presentation across patients, including variability in developmental progression and distinct neuroanatomical features influenced by sex and age. Recent deep learning models based on functional connectivity (FC) graphs have produced promising results, but they focus on generalized global activation patterns, failing to capture specialized regional characteristics or to accurately assess disease indications.
Methods: To overcome these limitations, we propose a novel deep learning method that models FC with multi-head attention, enabling simultaneous modeling of the intricate and variable patterns of brain connectivity associated with ASD and effective extraction of abnormal connectivity patterns. The proposed method not only identifies region-specific correlations but also emphasizes connections at specific, transient time points from diverse perspectives. The extracted FC is transformed into a graph whose edges carry weighted labels reflecting the degree of correlation, which is then processed by a graph neural network capable of handling edge labels.
Results: Experiments on the Autism Brain Imaging Data Exchange (ABIDE) I and II datasets, which include a heterogeneous cohort, showed superior performance over state-of-the-art methods, improving accuracy by up to 3.7 percentage points. Incorporating multi-head attention into FC analysis markedly improved the distinction between typical brains and those affected by ASD. Additionally, an ablation study validated diverse brain characteristics in ASD patients across different ages and sexes, offering insightful interpretations.
Conclusion: These results underscore the method's effectiveness in enhancing diagnostic accuracy and its potential to advance neurological research on ASD diagnosis.
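The pipeline outlined in the Methods section — multi-head attention over regional signals, averaged into an FC-like matrix, then converted to a graph with weighted edge labels — might be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' implementation: the random (untrained) projections, function names, and the simple threshold rule are all assumptions introduced here.

```python
import numpy as np

def multi_head_fc(ts, n_heads=4, d_head=8, seed=0):
    """Sketch of attention-based FC extraction (illustrative, untrained).

    ts: (T, R) array of T time points for R brain regions (ROIs).
    Each ROI's time series acts as one "token"; each head computes a
    row-stochastic (R, R) attention map, and heads are averaged into
    a single FC-like matrix.
    """
    rng = np.random.default_rng(seed)
    T, _ = ts.shape
    x = ts.T  # (R, T): one token per brain region
    fc_heads = []
    for _ in range(n_heads):
        # Random query/key projections stand in for learned weights.
        Wq = rng.standard_normal((T, d_head)) / np.sqrt(T)
        Wk = rng.standard_normal((T, d_head)) / np.sqrt(T)
        q, k = x @ Wq, x @ Wk
        scores = (q @ k.T) / np.sqrt(d_head)             # (R, R)
        attn = np.exp(scores - scores.max(axis=1, keepdims=True))
        attn /= attn.sum(axis=1, keepdims=True)          # row softmax
        fc_heads.append(attn)
    return np.mean(fc_heads, axis=0)

def fc_to_edges(fc, thresh=0.05):
    """Keep region pairs whose attention weight exceeds a threshold;
    the retained weight becomes the edge label consumed downstream by
    an edge-aware GNN (not shown here)."""
    R = fc.shape[0]
    return [(i, j, float(fc[i, j]))
            for i in range(R) for j in range(R)
            if i != j and fc[i, j] > thresh]

# Toy usage with synthetic time series: 100 time points, 10 regions.
ts = np.random.default_rng(1).standard_normal((100, 10))
fc = multi_head_fc(ts)
edges = fc_to_edges(fc)
```

In the paper the projections are learned end-to-end and the attention additionally attends over transient time points; the sketch above only illustrates the data flow from ROI signals to a labeled edge list.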
Pages: 14