Prototypical Graph Contrastive Learning

Cited by: 40
Authors
Lin, Shuai [1 ]
Liu, Chen [1 ]
Zhou, Pan [2 ]
Hu, Zi-Yuan [3 ]
Wang, Shuojia [4 ]
Zhao, Ruihui [4 ]
Zheng, Yefeng [4 ]
Lin, Liang [3 ]
Xing, Eric [5 ]
Liang, Xiaodan [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Intelligent Syst Engn, Shenzhen 518107, Peoples R China
[2] Sea AI Lab, Singapore 138522, Singapore
[3] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[4] Tencent Jarvis Lab, Shenzhen 518000, Peoples R China
[5] Mohamed Bin Zayed Univ Artificial Intelligence, Sch Comp Sci, Abu Dhabi, U Arab Emirates
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Prototypes; Semantics; Representation learning; Kernel; Perturbation methods; Loss measurement; Contrastive learning; graph representation learning; self-supervised learning; NETWORKS;
DOI
10.1109/TNNLS.2022.3191086
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph-level representations are critical in various real-world applications, such as predicting the properties of molecules. However, in practice, precise graph annotations are generally very expensive and time-consuming. To address this issue, graph contrastive learning constructs an instance discrimination task, which pulls together positive pairs (augmentation pairs of the same graph) and pushes apart negative pairs (augmentation pairs of different graphs) for unsupervised representation learning. However, since the negatives for a query are uniformly sampled from all graphs, existing methods suffer from a critical sampling bias issue: the negatives are likely to share the same semantic structure as the query, leading to performance degradation. To mitigate this sampling bias, we propose a prototypical graph contrastive learning (PGCL) approach in this article. Specifically, PGCL models the underlying semantic structure of the graph data by clustering semantically similar graphs into the same group, while simultaneously encouraging clustering consistency across different augmentations of the same graph. Given a query, it then performs negative sampling by drawing graphs from clusters that differ from the query's cluster, which ensures a semantic difference between the query and its negative samples. Moreover, PGCL further reweights the negative samples of a query based on the distance between their prototypes (cluster centroids) and the query prototype, so that negatives with a moderate prototype distance receive relatively large weights. This reweighting strategy is proven to be more effective than uniform sampling. Experimental results on various graph benchmarks demonstrate the advantages of our PGCL over state-of-the-art methods. The code is publicly available at https://github.com/ha-lins/PGCL.
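The cluster-based negative sampling and reweighting described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name `pgcl_negative_weights` and the specific weighting kernel (distance times a decaying exponential, which peaks at a moderate distance) are illustrative assumptions; the released code at the repository above defines the actual loss.

```python
import numpy as np

def pgcl_negative_weights(query_cluster, neg_clusters, prototypes, tau=1.0):
    """Sketch of PGCL-style debiased negative reweighting (illustrative).

    Negatives from the query's own cluster are excluded (weight 0),
    mitigating sampling bias. Remaining negatives are weighted by the
    distance between their prototype (cluster centroid) and the query's
    prototype, so that moderate distances receive the largest weights.
    """
    q_proto = prototypes[query_cluster]
    weights = np.zeros(len(neg_clusters))
    for i, c in enumerate(neg_clusters):
        if c == query_cluster:
            continue  # same semantic cluster: not a valid negative
        d = np.linalg.norm(prototypes[c] - q_proto)
        # d * exp(-d / tau) is maximized at d = tau, so negatives at a
        # moderate prototype distance get relatively large weights,
        # while very close and very distant clusters are downweighted.
        weights[i] = d * np.exp(-d / tau)
    total = weights.sum()
    return weights / total if total > 0 else weights
```

With three clusters whose prototypes sit at distances 0, 1, and 5 from the query's prototype, the same-cluster negative gets weight 0 and the moderate-distance cluster outweighs the distant one, matching the behavior the abstract describes.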
Pages: 2747-2758
Page count: 12
Related Papers
50 records
  • [1] Graph prototypical contrastive learning
    Peng, Meixin
    Juan, Xin
    Li, Zhanshan
    [J]. INFORMATION SCIENCES, 2022, 612 : 816 - 834
  • [2] X-GOAL: Multiplex Heterogeneous Graph Prototypical Contrastive Learning
    Jing, Baoyu
    Feng, Shengyu
    Xiang, Yuejia
    Chen, Xi
    Chen, Yu
    Tong, Hanghang
    [J]. PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 894 - 904
  • [3] Prototypical contrastive learning for image classification
    Yang, Han
    Li, Jun
    [J]. CLUSTER COMPUTING-THE JOURNAL OF NETWORKS SOFTWARE TOOLS AND APPLICATIONS, 2024, 27 (02): : 2059 - 2069
  • [4] Prototypical contrastive learning for image classification
    Han Yang
    Jun Li
    [J]. Cluster Computing, 2024, 27 : 2059 - 2069
  • [5] Multi-level Graph Contrastive Prototypical Clustering
    Zhang, Yuchao
    Yuan, Yuan
    Wang, Qi
    [J]. PROCEEDINGS OF THE THIRTY-SECOND INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2023, 2023, : 4611 - 4619
  • [6] Deep single-cell RNA-seq data clustering with graph prototypical contrastive learning
    Lee, Junseok
    Kim, Sungwon
    Hyun, Dongmin
    Lee, Namkyeong
    Kim, Yejin
    Park, Chanyoung
    [J]. BIOINFORMATICS, 2023, 39 (06)
  • [7] Learning from Label Proportions with Prototypical Contrastive Clustering
    Cue La Rosa, Laura Elena
    Borges Oliveira, Dario Augusto
    [J]. THIRTY-SIXTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FOURTH CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE / THE TWELFTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, : 2153 - 2161
  • [8] Prototypical Contrastive Learning for Domain Adaptive Semantic Segmentation
    Liu, Quansheng
    Pu, Chengdao
    Gao, Fang
    Yu, Jun
    [J]. 2023 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, IJCNN, 2023,
  • [9] Prototypical Contrastive Transfer Learning for Multimodal Language Understanding
    Otsuki, Seitaro
    Ishikawa, Shintaro
    Sugiura, Komei
    [J]. 2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, IROS, 2023, : 25 - 32
  • [10] Asymmetric Graph Contrastive Learning
    Chang, Xinglong
    Wang, Jianrong
    Guo, Rui
    Wang, Yingkui
    Li, Weihao
    [J]. MATHEMATICS, 2023, 11 (21)