PTGB: Pre-Train Graph Neural Networks for Brain Network Analysis

Cited by: 0
Authors
Yang, Yi [1 ]
Cui, Hejie [1 ]
Yang, Carl [1 ]
Affiliations
[1] Emory Univ, Atlanta, GA 30322 USA
Source
CONFERENCE ON HEALTH, INFERENCE, AND LEARNING, VOL 209, 2023
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The human brain is the central hub of the neurobiological system, controlling behavior and cognition in complex ways. Recent advances in neuroscience and neuroimaging analysis have spurred growing interest in the interactions between brain regions of interest (ROIs) and their impact on neural development and disorder diagnosis. As powerful deep models for analyzing graph-structured data, Graph Neural Networks (GNNs) have been applied to brain network analysis. However, training deep models requires large amounts of labeled data, which are often scarce in brain network datasets due to the complexities of data acquisition and sharing restrictions. To make the most of the available training data, we propose PTGB, a GNN pre-training framework that captures intrinsic brain network structures, independent of clinical outcomes, and is easily adaptable to various downstream tasks. PTGB comprises two key components: (1) an unsupervised pre-training technique designed specifically for brain networks, which enables learning from large-scale datasets without task-specific labels; (2) a data-driven parcellation atlas mapping pipeline that facilitates knowledge transfer across datasets with different ROI systems. Extensive evaluations using various GNN models demonstrate the robust and superior performance of PTGB compared to baseline methods.
Pages: 526-544
Page count: 19
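
The abstract describes a two-stage workflow: unsupervised pre-training of a GNN on unlabeled brain networks, followed by fine-tuning on a small labeled downstream dataset. The following is a minimal, hypothetical PyTorch sketch of that generic pre-train/fine-tune pattern, not the authors' PTGB implementation: the two-layer GCN, the link-reconstruction pre-training objective, the mean-pooling readout, and all hyperparameters are illustrative assumptions, and PTGB's actual objectives and its atlas-mapping pipeline are described only in the paper itself.

# Hypothetical sketch of pre-training a GNN on unlabeled brain networks,
# then fine-tuning on a labeled clinical task. Brain networks are assumed
# to be dense ROI-by-ROI connectivity matrices; every modeling choice here
# is an illustrative placeholder, not the PTGB method.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGCN(nn.Module):
    """Two-layer GCN over one dense adjacency matrix (one brain network)."""
    def __init__(self, n_rois: int, hidden: int = 64):
        super().__init__()
        # Node features are each ROI's connectivity profile (a row of adj).
        self.lin1 = nn.Linear(n_rois, hidden)
        self.lin2 = nn.Linear(hidden, hidden)

    def forward(self, adj: torch.Tensor) -> torch.Tensor:
        # Symmetric normalization: D^{-1/2} (A + I) D^{-1/2}
        a = adj + torch.eye(adj.size(0))
        d = a.sum(dim=1).clamp(min=1e-6).rsqrt()
        a_norm = d[:, None] * a * d[None, :]
        h = F.relu(self.lin1(a_norm @ adj))
        return a_norm @ self.lin2(h)  # ROI embeddings, shape [n_rois, hidden]

def pretrain_step(model, adj, optimizer):
    """Unsupervised step: reconstruct edges from ROI embeddings
    (a stand-in for whatever self-supervised objective PTGB uses)."""
    optimizer.zero_grad()
    z = model(adj)
    recon = torch.sigmoid(z @ z.t())                 # predicted edge strengths
    loss = F.mse_loss(recon, (adj > adj.mean()).float())
    loss.backward()
    optimizer.step()
    return loss.item()

def finetune_step(model, head, adj, label, optimizer):
    """Supervised step: pool ROI embeddings, predict a clinical label."""
    optimizer.zero_grad()
    graph_emb = model(adj).mean(dim=0)               # mean-pool over ROIs
    loss = F.cross_entropy(head(graph_emb)[None, :], label[None])
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    n_rois = 100                                     # e.g. a 100-ROI parcellation
    model = SimpleGCN(n_rois)
    # Stage 1: pre-train on (toy) unlabeled networks from a large source dataset.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(5):
        adj = torch.rand(n_rois, n_rois)
        adj = (adj + adj.t()) / 2                    # symmetric toy network
        pretrain_step(model, adj, opt)
    # Stage 2: fine-tune on a small labeled target dataset
    # (here the last toy network stands in for one labeled sample).
    head = nn.Linear(64, 2)                          # binary clinical outcome
    opt_ft = torch.optim.Adam(list(model.parameters()) + list(head.parameters()), lr=1e-4)
    finetune_step(model, head, adj, torch.tensor(0), opt_ft)

In this pattern, the pre-trained encoder weights are what transfer across datasets; the abstract's second component, atlas mapping, would additionally reconcile networks built from different ROI parcellations before they reach the shared encoder.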