Neighborhood-enhanced contrast for pre-training graph neural networks

Cited by: 0
Authors
Yichun Li
Jin Huang
Weihao Yu
Tinghua Zhang
Affiliations
[1] South China Normal University, School of Artificial Intelligence
[2] South China Normal University, School of Computer Science
[3] Research Institute of China Telecom Corporate Ltd.
[4] China Electronic Product Reliability and Environmental Testing Research Institute
Source
Neural Computing & Applications, 2024, 36(08): 4195-4205
Keywords
Pre-training graph neural networks; Contrastive learning; Neighborhood; Graph structure; Semantic space
DOI
Not available
Abstract
Pre-training graph neural networks (GNNs) has been proposed to improve graph-related downstream tasks such as link prediction and node classification. Most existing works employ contrastive learning to explore graph characteristics, enforcing positive sample pairs to be close and negative sample pairs to be distant after performing data augmentation on the input graph. However, these methods apply random operations during data augmentation and sample-pair construction, which neglects central nodes and the neighbor relationships between nodes. To address this problem, we propose a novel pre-training framework, Neighborhood-Enhanced Contrast for Pre-Training Graph Neural Networks (NECPT). Specifically, we design a data augmentation strategy based on node centrality that preserves central nodes and their incident edges, and use it to generate two semantically similar views of the input graph. Notably, NECPT constructs sample pairs by integrating potential node neighbors in both the graph structure and the semantic space to capture general graph regularities. After node representations are generated by GNN encoders and multilayer perceptrons, contrastive sample pairs are selected from the different kinds of node neighbors, which incorporates diverse neighborhood relations into contrastive learning. Finally, the node representations are used to predict node and edge attributes, extracting deep semantic connections between attribute and structure information. Extensive experiments on benchmark datasets in biology and chemistry demonstrate the effectiveness of the proposed approach.
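The centrality-based augmentation and neighborhood-based positive selection described in the abstract can be sketched in plain Python. This is an illustrative toy, not the authors' implementation: it assumes degree as the centrality measure and a linear drop-probability schedule, the helper names `centrality_aware_view` and `structural_positives` are hypothetical, and the semantic-space neighbors, GNN encoders, and MLP projection heads of the actual NECPT model are omitted.

```python
import random

def degree_centrality(num_nodes, edges):
    """Fraction of the other nodes each node is connected to."""
    deg = [0] * num_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return [d / max(num_nodes - 1, 1) for d in deg]

def centrality_aware_view(num_nodes, edges, drop_rate=0.3, seed=0):
    """Build one augmented view: edges whose endpoints have low
    centrality are dropped with higher probability, so central nodes
    and their incident edges tend to be preserved."""
    rng = random.Random(seed)
    cent = degree_centrality(num_nodes, edges)
    max_c = max(cent) or 1.0
    kept = []
    for u, v in edges:
        # Edge importance: mean normalized centrality of its endpoints.
        importance = (cent[u] + cent[v]) / (2.0 * max_c)
        # More important edges get a smaller drop probability.
        if rng.random() >= drop_rate * (1.0 - importance):
            kept.append((u, v))
    return kept

def structural_positives(node, edges):
    """Positive set for `node`: itself (i.e., its counterpart in the
    other view) plus its one-hop neighbors in the graph structure."""
    nbrs = {v for u, v in edges if u == node} | {u for u, v in edges if v == node}
    return {node} | nbrs

# Toy graph: node 0 is the hub of a small star with one extra edge.
edges = [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2)]
view_a = centrality_aware_view(5, edges, seed=1)  # first view
view_b = centrality_aware_view(5, edges, seed=2)  # second view
```

In a full pipeline, the two views would be encoded by shared GNN encoders, and a contrastive loss would pull each node's representation toward those of its positives while pushing it away from the remaining nodes.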
Pages: 4195-4205 (10 pages)
Related papers (50 in total; the first 10 are listed below)
  • [1] Neighborhood-enhanced contrast for pre-training graph neural networks
    Li, Yichun
    Huang, Jin
    Yu, Weihao
    Zhang, Tinghua
    [J]. NEURAL COMPUTING & APPLICATIONS, 2024, 36 (08): : 4195 - 4205
  • [2] Pre-training on dynamic graph neural networks
    Chen, Ke-Jia
    Zhang, Jiajun
    Jiang, Linpu
    Wang, Yunyun
    Dai, Yuxuan
    [J]. NEUROCOMPUTING, 2022, 500 : 679 - 687
  • [3] Pre-training graph neural networks for link prediction in biomedical networks
    Long, Yahui
    Wu, Min
    Liu, Yong
    Fang, Yuan
    Kwoh, Chee Keong
    Chen, Jinmiao
    Luo, Jiawei
    Li, Xiaoli
    [J]. BIOINFORMATICS, 2022, 38 (08) : 2254 - 2262
  • [4] GPPT: Graph Pre-training and Prompt Tuning to Generalize Graph Neural Networks
    Sun, Mingchen
    Zhou, Kaixiong
    He, Xin
    Wang, Ying
    Wang, Xin
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 1717 - 1727
  • [5] GPT-GNN: Generative Pre-Training of Graph Neural Networks
    Hu, Ziniu
    Dong, Yuxiao
    Wang, Kuansan
    Chang, Kai-Wei
    Sun, Yizhou
    [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1857 - 1867
  • [6] Train Once and Explain Everywhere: Pre-training Interpretable Graph Neural Networks
    Yin, Jun
    Li, Chaozhuo
    Yan, Hao
    Lian, Jianxun
    Wang, Senzhang
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [7] Graph Neural Pre-training for Recommendation with Side Information
    Liu, Siwei
    Meng, Zaiqiao
    Macdonald, Craig
    Ounis, Iadh
    [J]. ACM TRANSACTIONS ON INFORMATION SYSTEMS, 2023, 41 (03)
  • [8] GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training
    Qiu, Jiezhong
    Chen, Qibin
    Dong, Yuxiao
    Zhang, Jing
    Yang, Hongxia
    Ding, Ming
    Wang, Kuansan
    Tang, Jie
    [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020, : 1150 - 1160
  • [9] Pre-Training Graph Neural Networks for Cold-Start Users and Items Representation
    Hao, Bowen
    Zhang, Jing
    Yin, Hongzhi
    Li, Cuiping
    Chen, Hong
    [J]. WSDM '21: PROCEEDINGS OF THE 14TH ACM INTERNATIONAL CONFERENCE ON WEB SEARCH AND DATA MINING, 2021, : 265 - 273
  • [10] Continual Pre-Training of Language Models for Concept Prerequisite Learning with Graph Neural Networks
    Tang, Xin
    Liu, Kunjia
    Xu, Hao
    Xiao, Weidong
    Tan, Zhen
    [J]. MATHEMATICS, 2023, 11 (12)