Sub-graph Contrast for Scalable Self-Supervised Graph Representation Learning

Cited by: 75
|
Authors
Jiao, Yizhu [1 ]
Xiong, Yun [1 ,2 ]
Zhang, Jiawei [3 ]
Zhang, Yao [1 ]
Zhang, Tianqi [1 ]
Zhu, Yangyong [1 ,2 ]
Affiliations
[1] Fudan Univ, Sch Comp Sci, Shanghai Key Lab Data Sci, Shanghai, Peoples R China
[2] Fudan Univ, Shanghai Inst Adv Commun & Data Sci, Shanghai, Peoples R China
[3] Florida State Univ, Dept Comp Sci, IFM Lab, Tallahassee, FL 32306 USA
Funding
National Natural Science Foundation of China;
Keywords
Self-Supervised Learning; Graph Representation Learning; Subgraph Contrast; Graph Neural Networks;
DOI
10.1109/ICDM50108.2020.00031
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph representation learning has attracted considerable attention in recent years. Existing graph neural networks that consume the complete graph are not scalable because of their high computation and memory costs, so capturing the rich information in large-scale graph data remains a great challenge. Moreover, these methods mainly focus on supervised learning and depend heavily on node label information, which is expensive to obtain in the real world. Unsupervised network embedding approaches, in contrast, overemphasize node proximity, and their learned representations can hardly be used directly in downstream application tasks. In recent years, emerging self-supervised learning has provided a potential solution to the aforementioned problems. However, existing self-supervised works also operate on the complete graph data and are biased toward fitting either global or very local (1-hop neighborhood) graph structure in their mutual-information-based loss terms. In this paper, we propose SUBG-CON, a novel self-supervised representation learning method via Sub-graph Contrast, which exploits the strong correlation between central nodes and their sampled subgraphs to capture regional structural information. Instead of learning on the complete input graph, SUBG-CON couples a novel data augmentation strategy with a contrastive loss defined over subgraphs sampled from the original graph. Compared with existing graph representation learning approaches, SUBG-CON has prominent advantages in weaker supervision requirements, model learning scalability, and parallelization. Extensive experiments on multiple real-world large-scale benchmark datasets from different domains verify both the effectiveness and the efficiency of our work against both classic and state-of-the-art graph representation learning approaches.
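The core idea of the abstract, contrasting a central node's embedding with a summary of its own sampled subgraph (positive pair) against another node's subgraph summary (negative pair), can be sketched with a margin-based loss. This is an illustrative numpy sketch, not the paper's implementation (which uses a GNN encoder and importance-based subgraph sampling); the function name, the cosine-similarity choice, and the margin value are assumptions.

```python
import numpy as np

def subgraph_contrastive_loss(node_emb, subgraph_emb, margin=0.75):
    """Margin-based contrastive loss for subgraph contrast.

    node_emb:     (n, d) embeddings of the central nodes.
    subgraph_emb: (n, d) pooled summaries of each node's sampled subgraph.
    Positive pair: a node and its own subgraph summary.
    Negative pair: a node and another node's subgraph summary.
    """
    def cos(a, b):
        # Row-wise cosine similarity between two (n, d) matrices.
        a = a / np.linalg.norm(a, axis=1, keepdims=True)
        b = b / np.linalg.norm(b, axis=1, keepdims=True)
        return np.sum(a * b, axis=1)

    # Negatives: shift the subgraph summaries by one row so each node
    # is paired with a different node's subgraph.
    neg = np.roll(subgraph_emb, 1, axis=0)
    pos_sim = cos(node_emb, subgraph_emb)
    neg_sim = cos(node_emb, neg)
    # Hinge: positive pairs should score higher than negatives by `margin`.
    return float(np.mean(np.maximum(0.0, neg_sim - pos_sim + margin)))
```

With embeddings that already match their subgraph summaries the loss is near zero, while random pairings incur roughly the margin, which is the gradient signal that pulls a node's representation toward its own regional context.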
Pages: 222 - 231
Page count: 10
Related Papers
50 records in total
  • [1] Generative Subgraph Contrast for Self-Supervised Graph Representation Learning
    Han, Yuehui
    Hui, Le
    Jiang, Haobo
    Qian, Jianjun
    Xie, Jin
    [J]. COMPUTER VISION - ECCV 2022, PT XXX, 2022, 13690 : 91 - 107
  • [2] Adaptive Self-Supervised Graph Representation Learning
    Gong, Yunchi
    [J]. 36TH INTERNATIONAL CONFERENCE ON INFORMATION NETWORKING (ICOIN 2022), 2022, : 254 - 259
  • [3] Scalable self-supervised graph representation learning via enhancing and contrasting subgraphs
    Jiao, Yizhu
    Xiong, Yun
    Zhang, Jiawei
    Zhang, Yao
    Zhang, Tianqi
    Zhu, Yangyong
    [J]. KNOWLEDGE AND INFORMATION SYSTEMS, 2022, 64 (01) : 235 - 260
  • [4] CLEAR: Cluster-Enhanced Contrast for Self-Supervised Graph Representation Learning
    Luo, Xiao
    Ju, Wei
    Qu, Meng
    Gu, Yiyang
    Chen, Chong
    Deng, Minghua
    Hua, Xian-Sheng
    Zhang, Ming
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (01) : 899 - 912
  • [5] Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast
    Chen, Ke-Jia
    Liu, Linsong
    Jiang, Linpu
    Chen, Jingqiang
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 18 (01)
  • [6] Self-supervised Consensus Representation Learning for Attributed Graph
    Liu, Changshu
    Wen, Liangjian
    Kang, Zhao
    Luo, Guangchun
    Tian, Ling
    [J]. PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 2654 - 2662
  • [7] Self-supervised Graph Representation Learning with Variational Inference
    Liao, Zihan
    Liang, Wenxin
    Liu, Han
    Mu, Jie
    Zhang, Xianchao
    [J]. ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2021, PT III, 2021, 12714 : 116 - 127
  • [8] Self-supervised graph representation learning via bootstrapping
    Che, Feihu
    Yang, Guohua
    Zhang, Dawei
    Tao, Jianhua
    Liu, Tong
    [J]. NEUROCOMPUTING, 2021, 456 : 88 - 96
  • [9] Simple Self-supervised Multiplex Graph Representation Learning
    Mo, Yujie
    Chen, Yuhuan
    Peng, Liang
    Shi, Xiaoshuang
    Zhu, Xiaofeng
    [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2022, 2022, : 3301 - 3309