Pre-training graph neural networks for link prediction in biomedical networks

Cited: 35
Authors
Long, Yahui [1 ]
Wu, Min [2 ]
Liu, Yong [3 ]
Fang, Yuan [4 ]
Kwoh, Chee Keong [5 ]
Chen, Jinmiao [1 ]
Luo, Jiawei [6 ]
Li, Xiaoli [2 ]
Affiliations
[1] Agcy Sci Technol & Res, Singapore Immunol Network SIgN, Singapore, Singapore
[2] Agcy Sci Technol & Res, Inst Infocomm Res, Singapore, Singapore
[3] Joint NTU UBC Res Ctr Excellence Act Living Elder, Singapore, Singapore
[4] Singapore Management Univ, Sch Informat Syst, Singapore 178902, Singapore
[5] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore, Singapore
[6] Hunan Univ, Coll Comp Sci & Elect Engn, Changsha, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
COMPLEX DISEASES; MODELS;
DOI
10.1093/bioinformatics/btac100
Chinese Library Classification
Q5 [Biochemistry];
Subject Classification Codes
071010; 081704;
Abstract
Motivation: Graphs or networks are widely used to model the interactions between different entities (e.g. proteins and drugs) in biomedical applications. Predicting potential interactions/links in biomedical networks is important for understanding the pathological mechanisms of various complex human diseases, as well as for screening compound targets in drug discovery. Graph neural networks (GNNs) have been applied to link prediction in various biomedical networks, relying on node features extracted from different data sources, e.g. sequence, structure and network data. However, it is challenging to effectively integrate these data sources and automatically extract features for different link prediction tasks.

Results: In this article, we propose a novel pre-training graph neural network-based framework named PT-GNN to integrate different data sources for link prediction in biomedical networks. First, we design expressive deep learning methods [e.g. convolutional neural network (CNN) and graph convolutional network (GCN)] to learn features for individual nodes from sequence and structure data. Second, we propose a GCN-based encoder to further refine the node features by modelling the dependencies among nodes in the network. Third, the node features are pre-trained on graph reconstruction tasks, and the pre-trained features can be used for model initialization in downstream tasks. Extensive experiments have been conducted on two critical link prediction tasks, i.e. synthetic lethality (SL) prediction and drug-target interaction (DTI) prediction. Experimental results demonstrate that PT-GNN outperforms the state-of-the-art methods for both SL prediction and DTI prediction. In addition, the pre-trained features help improve the performance and reduce the training time of existing models.

Availability and implementation: Python code and datasets are available at: https://github.com/longyahui/PT-GNN.

Contact: luojiawei@hnu.edu.cn or xlli@i2r.a-star.edu.sg
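The pre-training scheme the abstract describes (a GCN encoder refining node features, pre-trained on a graph-reconstruction objective) can be sketched in minimal form as below. This is an illustrative assumption-laden sketch, not the authors' implementation: the layer sizes, the two-layer encoder, and the inner-product decoder with binary cross-entropy are common choices for graph autoencoders and are assumed here for concreteness.

```python
import numpy as np

def normalize_adj(adj):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, the standard GCN propagation matrix
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gcn_layer(a_norm, h, w):
    # One graph-convolution step: propagate over the graph, transform, ReLU
    return np.maximum(a_norm @ h @ w, 0.0)

def reconstruction_loss(z, adj):
    # Inner-product decoder with sigmoid; binary cross-entropy against the adjacency matrix
    probs = 1.0 / (1.0 + np.exp(-(z @ z.T)))
    eps = 1e-9
    return -np.mean(adj * np.log(probs + eps) + (1 - adj) * np.log(1 - probs + eps))

rng = np.random.default_rng(0)
# Toy 4-node path graph; in PT-GNN the graph would be an SL or DTI network
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
feats = rng.normal(size=(4, 8))       # node features, e.g. produced by a CNN over sequences
w1 = rng.normal(size=(8, 16)) * 0.1   # encoder weights (would be trained by gradient descent)
w2 = rng.normal(size=(16, 4)) * 0.1

a_norm = normalize_adj(adj)
z = gcn_layer(a_norm, gcn_layer(a_norm, feats, w1), w2)  # refined node embeddings
loss = reconstruction_loss(z, adj)    # pre-training objective: reconstruct the graph
```

In actual pre-training the weights would be optimized to minimize this loss, and the resulting embeddings (or weights) would initialize the downstream SL/DTI prediction model.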
Pages: 2254 - 2262
Number of pages: 9
Related Papers
50 records
  • [1] Pre-training on dynamic graph neural networks
    Chen, Ke-Jia
    Zhang, Jiajun
    Jiang, Linpu
    Wang, Yunyun
    Dai, Yuxuan
    [J]. NEUROCOMPUTING, 2022, 500 : 679 - 687
  • [2] PHGNN: Pre-Training Heterogeneous Graph Neural Networks
    Li, Xin
    Wei, Hao
    Ding, Yu
    [J]. IEEE Access, 2024, 12 : 135411 - 135418
  • [3] PT-KGNN: A framework for pre-training biomedical knowledge graphs with graph neural networks
    Wang, Zhenxing
    Wei, Zhongyu
    [J]. Computers in Biology and Medicine, 2024, 178
  • [4] GPPT: Graph Pre-training and Prompt Tuning to Generalize Graph Neural Networks
    Sun, Mingchen
    Zhou, Kaixiong
    He, Xin
    Wang, Ying
    Wang, Xin
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 1717 - 1727
  • [5] Neighborhood-enhanced contrast for pre-training graph neural networks
    Li, Yichun
    Huang, Jin
    Yu, Weihao
    Zhang, Tinghua
    [J]. Neural Computing and Applications, 2024, 36 : 4195 - 4205
  • [6] GPT-GNN: Generative Pre-Training of Graph Neural Networks
    Hu, Ziniu
    Dong, Yuxiao
    Wang, Kuansan
    Chang, Kai-Wei
    Sun, Yizhou
    [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1857 - 1867
  • [8] Train Once and Explain Everywhere: Pre-training Interpretable Graph Neural Networks
    Yin, Jun
    Li, Chaozhuo
    Yan, Hao
    Lian, Jianxun
    Wang, Senzhang
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [9] Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation
    Hao, Bowen
    Zhang, Jing
    Yin, Hongzhi
    Li, Cuiping
    Chen, Hong
    [J]. WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2021, : 265 - 273
  • [10] Continual Pre-Training of Language Models for Concept Prerequisite Learning with Graph Neural Networks
    Tang, Xin
    Liu, Kunjia
    Xu, Hao
    Xiao, Weidong
    Tan, Zhen
    [J]. MATHEMATICS, 2023, 11 (12)