GPT-GNN: Generative Pre-Training of Graph Neural Networks

Cited by: 238
Authors
Hu, Ziniu [1 ]
Dong, Yuxiao [2 ]
Wang, Kuansan [2 ]
Chang, Kai-Wei [1 ]
Sun, Yizhou [1 ]
Affiliations
[1] University of California, Los Angeles, Los Angeles, CA 90024, USA
[2] Microsoft Research, Redmond, WA, USA
Funding
U.S. National Science Foundation
Keywords
Generative Pre-Training; Graph Neural Networks; Graph Representation Learning; Network Embedding; GNN Pre-Training
DOI
10.1145/3394486.3403237
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data. However, training GNNs usually requires abundant task-specific labeled data, which is often arduously expensive to obtain. One effective way to reduce the labeling effort is to pre-train an expressive GNN model on unlabeled data with self-supervision and then transfer the learned model to downstream tasks with only a few labels. In this paper, we present the GPT-GNN framework to initialize GNNs by generative pre-training. GPT-GNN introduces a self-supervised attributed graph generation task to pre-train a GNN so that it can capture the structural and semantic properties of the graph. We factorize the likelihood of the graph generation into two components: 1) Attribute Generation and 2) Edge Generation. By modeling both components, GPT-GNN captures the inherent dependency between node attributes and graph structure during the generative process. Comprehensive experiments on the billion-scale Open Academic Graph and Amazon recommendation data demonstrate that GPT-GNN significantly outperforms state-of-the-art GNN models without pre-training by up to 9.1% across various downstream tasks.
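The abstract's factorization of the generative objective into attribute generation and edge generation can be illustrated with a small self-supervised pre-training sketch. The code below is a minimal, hypothetical PyTorch illustration: the names `TinyGNN` and `GPTGNNPretrainer`, the single mean-aggregation encoder layer, and the crude negative sampling are all assumptions made here for exposition, not the authors' implementation (the actual framework operates on heterogeneous graphs with subgraph sampling and adaptive embedding queues).

```python
# Illustrative sketch only: mimics GPT-GNN's two-part self-supervised objective
# (attribute generation + edge generation) on a toy homogeneous graph.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyGNN(nn.Module):
    """One mean-aggregation message-passing layer (stand-in for a full GNN encoder)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)

    def forward(self, x, adj):
        # adj: dense (N, N) adjacency; mean-aggregate neighbor features, then transform.
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        h = adj @ x / deg
        return F.relu(self.lin(h))

class GPTGNNPretrainer(nn.Module):
    """Hypothetical sketch of the factorized objective: attribute + edge generation."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.encoder = TinyGNN(in_dim, hid_dim)
        self.attr_decoder = nn.Linear(hid_dim, in_dim)   # reconstruct hidden attributes

    def forward(self, x, adj, masked_nodes, pos_edges, neg_edges):
        x_in = x.clone()
        x_in[masked_nodes] = 0.0                         # hide attributes of target nodes
        h = self.encoder(x_in, adj)

        # 1) Attribute generation: predict the hidden attributes from graph context.
        attr_loss = F.mse_loss(self.attr_decoder(h[masked_nodes]), x[masked_nodes])

        # 2) Edge generation: observed edges should score higher than sampled non-edges.
        pos = (h[pos_edges[0]] * h[pos_edges[1]]).sum(-1)
        neg = (h[neg_edges[0]] * h[neg_edges[1]]).sum(-1)
        edge_loss = F.binary_cross_entropy_with_logits(
            torch.cat([pos, neg]),
            torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]),
        )
        return attr_loss + edge_loss

# Usage on a toy random graph (illustration only).
if __name__ == "__main__":
    torch.manual_seed(0)
    N, D, H = 8, 16, 32
    x = torch.randn(N, D)
    adj = (torch.rand(N, N) > 0.7).float()
    adj = ((adj + adj.T) > 0).float()                    # symmetrize
    pos = adj.nonzero().T                                # (2, num_edges) endpoint indices
    neg = torch.randint(0, N, pos.shape)                 # crude negative sampling
    model = GPTGNNPretrainer(D, H)
    loss = model(x, adj, masked_nodes=torch.tensor([0, 1]), pos_edges=pos, neg_edges=neg)
    loss.backward()
    print(float(loss))
```

Note that this sketch simply sums the two losses on the same node representations; the paper additionally separates each node's attribute-generation and edge-generation views so that edge prediction cannot trivially read off the attributes it is asked to reconstruct.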
Pages: 1857-1867
Number of pages: 11
Related Papers
50 records in total
  • [1] Li, Zhonghang; Xia, Lianghao; Xu, Yong; Huang, Chao. GPT-ST: Generative Pre-Training of Spatio-Temporal Graph Neural Networks. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
  • [2] Chen, Ke-Jia; Zhang, Jiajun; Jiang, Linpu; Wang, Yunyun; Dai, Yuxuan. Pre-training on dynamic graph neural networks. Neurocomputing, 2022, 500: 679-687.
  • [3] Long, Yahui; Wu, Min; Liu, Yong; Fang, Yuan; Kwoh, Chee Keong; Chen, Jinmiao; Luo, Jiawei; Li, Xiaoli. Pre-training graph neural networks for link prediction in biomedical networks. Bioinformatics, 2022, 38(8): 2254-2262.
  • [4] Sun, Mingchen; Zhou, Kaixiong; He, Xin; Wang, Ying; Wang, Xin. GPPT: Graph Pre-training and Prompt Tuning to Generalize Graph Neural Networks. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 1717-1727.
  • [5] Li, Yichun; Huang, Jin; Yu, Weihao; Zhang, Tinghua. Neighborhood-enhanced contrast for pre-training graph neural networks. Neural Computing and Applications, 2024, 36(8): 4195-4205.
  • [6] Min, Xin; Li, Wei; Yang, Jinzhao; Xie, Weidong; Zhao, Dazhe. Self-supervised graph neural network with pre-training generative learning for recommendation systems. Scientific Reports, 2022, 12(1).
  • [7] Yin, Jun; Li, Chaozhuo; Yan, Hao; Lian, Jianxun; Wang, Senzhang. Train Once and Explain Everywhere: Pre-training Interpretable Graph Neural Networks. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
  • [8] Liu, Siwei; Meng, Zaiqiao; Macdonald, Craig; Ounis, Iadh. Graph Neural Pre-training for Recommendation with Side Information. ACM Transactions on Information Systems, 2023, 41(3).