Toward Graph Self-Supervised Learning With Contrastive Adjusted Zooming

Citations: 4
Authors
Zheng, Yizhen [1 ]
Jin, Ming [1 ]
Pan, Shirui [2 ]
Li, Yuan-Fang [1 ]
Peng, Hao [3 ]
Li, Ming [1 ,4 ]
Li, Zhao
Affiliations
[1] Monash Univ, Dept Data Sci & AI, Fac IT, Clayton, Vic 3800, Australia
[2] Griffith Univ, Sch Informat & Commun Technol, Gold Coast, Qld 4222, Australia
[3] Beihang Univ, Beijing Adv Innovat Ctr Big Data & Brain Comp, Beijing 100191, Peoples R China
[4] Zhejiang Normal Univ, Key Lab Intelligent Educ Technol & Applicat Zheji, Jinhua 321004, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Contrastive learning; graph neural networks (GNNs); graph representation learning (GRL); self-supervised learning;
DOI
10.1109/TNNLS.2022.3216630
Chinese Library Classification
TP18 [Theory of artificial intelligence];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph representation learning (GRL) is critical for analyzing graph-structured data. However, most existing graph neural networks (GNNs) rely heavily on labeling information, which is normally expensive to obtain in the real world. Although some existing works aim to learn graph representations effectively in an unsupervised manner, they suffer from certain limitations, such as a heavy reliance on monotone contrastiveness and limited scalability. To overcome these problems, in light of recent advances in graph contrastive learning, we introduce a novel self-supervised GRL algorithm via graph contrastive adjusted zooming, namely G-Zoom, which learns node representations by leveraging the proposed adjusted zooming scheme. Specifically, this mechanism enables G-Zoom to explore and extract self-supervision signals from a graph at multiple scales: micro (i.e., node level), meso (i.e., neighborhood level), and macro (i.e., subgraph level). First, we generate two augmented views of the input graph via two different graph augmentations. Then, we establish contrastive objectives at the above three scales progressively, from the node level through the neighborhood level to the subgraph level, maximizing the agreement between graph representations across scales. While the micro and macro perspectives extract valuable clues from a given graph on their own, the neighborhood-level contrastiveness gives G-Zoom a customizable option, based on our adjusted zooming scheme, to manually choose an optimal viewpoint lying between the micro and macro perspectives and thus better understand the graph data. In addition, to make our model scalable to large graphs, we use a parallel graph diffusion approach to decouple model training from the graph size. We conducted extensive experiments on real-world datasets, and the results demonstrate that our proposed model consistently outperforms state-of-the-art methods.
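The contrastive pipeline the abstract describes (two augmented views, agreement maximized across views) can be sketched at the node level as a small toy in NumPy. This is an illustrative sketch, not the authors' implementation: the edge-dropping/feature-masking augmentations, the one-step propagation "encoder", and all names (`augment`, `encode`, `node_level_infonce`) are assumptions for exposition, and G-Zoom additionally contrasts at the neighborhood and subgraph scales and uses graph diffusion for scalability.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy undirected graph: dense adjacency matrix and node features.
n, d = 8, 5
A = (rng.random((n, n)) < 0.3).astype(float)
A = np.triu(A, 1)
A = A + A.T                              # symmetric, no self-loops
X = rng.standard_normal((n, d))

def augment(A, X, drop_edge=0.2, mask_feat=0.2):
    """One augmented view: randomly drop edges and mask feature entries."""
    A2 = A * (rng.random(A.shape) > drop_edge)
    A2 = np.triu(A2, 1)
    A2 = A2 + A2.T                       # keep the view undirected
    X2 = X * (rng.random(X.shape) > mask_feat)
    return A2, X2

def encode(A, X):
    """Stand-in GNN encoder: one step of mean-neighborhood propagation."""
    deg = A.sum(1, keepdims=True) + 1.0  # +1 accounts for the self term
    H = (A @ X + X) / deg
    return H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-8)

def node_level_infonce(H1, H2, tau=0.5):
    """Node-level contrastiveness: the same node in the other view is the
    positive; all other nodes in that view are negatives (InfoNCE form)."""
    sim = (H1 @ H2.T) / tau
    sim = np.exp(sim - sim.max(1, keepdims=True))  # row-wise stability
    pos_prob = np.diag(sim) / sim.sum(1)           # softmax of the positive
    return float(-np.log(pos_prob).mean())

A1, X1 = augment(A, X)
A2v, X2 = augment(A, X)
H1, H2 = encode(A1, X1), encode(A2v, X2)
loss = node_level_infonce(H1, H2)
```

Minimizing `loss` pulls each node's representations in the two views together while pushing apart representations of different nodes; the paper's meso and macro objectives apply the same idea to neighborhood and subgraph summaries instead of single nodes.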
Pages: 8882-8896
Page count: 15
Related Papers (10 of 50 shown)
  • [1] Contrastive Self-supervised Learning for Graph Classification
    Zeng, Jiaqi
    Xie, Pengtao
    [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 10824 - 10832
  • [2] JGCL: Joint Self-Supervised and Supervised Graph Contrastive Learning
    Akkas, Selahattin
    Azad, Ariful
    [J]. COMPANION PROCEEDINGS OF THE WEB CONFERENCE 2022, WWW 2022 COMPANION, 2022, : 1099 - 1105
  • [3] Generative and Contrastive Self-Supervised Learning for Graph Anomaly Detection
    Zheng, Yu
    Jin, Ming
    Liu, Yixin
    Chi, Lianhua
    Phan, Khoa T.
    Chen, Yi-Ping Phoebe
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (12) : 12220 - 12233
  • [4] Self-supervised Graph Contrastive Learning for Video Question Answering
    Yao, Xuan
    Gao, Jun-Yu
    Xu, Chang-Sheng
    [J]. Ruan Jian Xue Bao/Journal of Software, 2023, 34 (05): : 2083 - 2100
  • [5] Robust Hypergraph-Augmented Graph Contrastive Learning for Graph Self-Supervised Learning
    Wang, Zeming
    Li, Xiaoyang
    Wang, Rui
    Zheng, Changwen
    [J]. 2023 2ND ASIA CONFERENCE ON ALGORITHMS, COMPUTING AND MACHINE LEARNING, CACML 2023, 2023, : 287 - 293
  • [6] Toward Understanding the Feature Learning Process of Self-supervised Contrastive Learning
    Wen, Zixin
    Li, Yuanzhi
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [7] Self-supervised Graph-level Representation Learning with Adversarial Contrastive Learning
    Luo, Xiao
    Ju, Wei
    Gu, Yiyang
    Mao, Zhengyang
    Liu, Luchen
    Yuan, Yuhui
    Zhang, Ming
    [J]. ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2024, 18 (02)
  • [8] TCGL: Temporal Contrastive Graph for Self-Supervised Video Representation Learning
    Liu, Yang
    Wang, Keze
    Liu, Lingbo
    Lan, Haoyuan
    Lin, Liang
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2022, 31 : 1978 - 1993
  • [9] Federated Graph Anomaly Detection via Contrastive Self-Supervised Learning
    Kong, Xiangjie
    Zhang, Wenyi
    Wang, Hui
    Hou, Mingliang
    Chen, Xin
    Yan, Xiaoran
    Das, Sajal K.
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024,
  • [10] Negative sampling strategies for contrastive self-supervised learning of graph representations
    Hafidi, Hakim
    Ghogho, Mounir
    Ciblat, Philippe
    Swami, Ananthram
    [J]. SIGNAL PROCESSING, 2022, 190