Multi-level graph contrastive learning

Cited by: 2
Authors
Shao, Pengpeng [1 ]
Tao, Jianhua [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, BNRIST, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing Natl Res Ctr Informat Sci & Technol, Beijing, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph representation learning; Self-supervised learning; Contrastive learning;
DOI
10.1016/j.neucom.2023.127101
CLC classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Graph representation learning, which aims to learn a discriminative representation for each node in a graph, has attracted a surge of interest recently. Most existing methods rely on supervised learning and depend heavily on label information. However, annotating graphs is expensive in the real world, especially in specialized domains such as biology, because it requires annotators with domain knowledge. Self-supervised learning offers a feasible way to approach this problem for graph representation learning. In this paper, we propose a Multi-Level Graph Contrastive Learning (MLGCL) framework that learns robust representations of graph data by contrasting space views of graphs. Specifically, we introduce a novel contrastive view, the space view. The original graph is a first-order approximation structure in topological space, where nodes are linked by feature similarity, relationships, etc., whereas the k-nearest neighbor (kNN) graph with community structure, generated from encoded features, preserves high-order proximity in feature space; it therefore provides a complementary view of the original graph from the feature-space perspective and is also well suited to GNN encoders. Furthermore, we develop a multi-level contrastive mode that preserves the local similarity and the semantic similarity of graph-structured data simultaneously. Extensive experiments indicate that MLGCL achieves promising results compared with existing state-of-the-art graph representation learning methods on seven node classification datasets and three graph classification datasets.
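A minimal sketch of the feature-space view described in the abstract: building a kNN graph from (already encoded) node features, which MLGCL contrasts against the original topology view. This is not the authors' code; the choice of cosine similarity and k = 5 are illustrative assumptions.

```python
# Illustrative sketch (not the authors' implementation): construct the kNN
# feature-space view that is contrasted with the original graph in MLGCL.
import numpy as np

def knn_graph(features: np.ndarray, k: int = 5) -> np.ndarray:
    """Symmetric adjacency matrix linking each node to its k nearest
    neighbors by cosine similarity in feature space."""
    # Row-normalize so the dot product equals cosine similarity.
    norm = np.linalg.norm(features, axis=1, keepdims=True) + 1e-12
    sim = (features / norm) @ (features / norm).T
    np.fill_diagonal(sim, -np.inf)            # exclude self-loops
    n = sim.shape[0]
    adj = np.zeros((n, n), dtype=np.float32)
    topk = np.argsort(-sim, axis=1)[:, :k]    # k most similar nodes per row
    rows = np.repeat(np.arange(n), k)
    adj[rows, topk.ravel()] = 1.0
    return np.maximum(adj, adj.T)             # symmetrize

# Example: 100 nodes with 16-dimensional encoded features (random placeholder data).
X = np.random.rand(100, 16).astype(np.float32)
A_feat = knn_graph(X, k=5)   # feature-space view; the original adjacency is the topology view
```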
Pages: 10
Related papers (50 in total)
  • [21] Towards Spoken Language Understanding via Multi-level Multi-grained Contrastive Learning
    Cheng, Xuxin
    Xu, Wanshi
    Zhu, Zhihong
    Li, Hongxiang
    Zou, Yuexian
    [J]. PROCEEDINGS OF THE 32ND ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2023, 2023, : 326 - 336
  • [22] MLGAL: Multi-level Label Graph Adaptive Learning for node clustering in the attributed graph
    Yu, Jiajun
    Jia, Adele Lu
    [J]. KNOWLEDGE-BASED SYSTEMS, 2023, 278
  • [23] Multi-level Shared Knowledge Guided Learning for Knowledge Graph Completion
    Shan, Yongxue
    Zhou, Jie
    Peng, Jie
    Zhou, Xin
    Yin, Jiaqian
    Wang, Xiaodong
    [J]. TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2024, 12 : 1027 - 1042
  • [24] Deep learning and multi-level featurization of graph representations of microstructural data
    Jones, Reese
    Safta, Cosmin
    Frankel, Ari
    [J]. COMPUTATIONAL MECHANICS, 2023, 72 (01) : 57 - 75
  • [25] Efficient Fair Graph Representation Learning Using a Multi-level Framework
    He, Yuntian
    Gurukar, Saket
    Parthasarathy, Srinivasan
    [J]. COMPANION OF THE WORLD WIDE WEB CONFERENCE, WWW 2023, 2023, : 298 - 301
  • [27] Multi-level cross-modal contrastive learning for review-aware recommendation
    Wei, Yibiao
    Xu, Yang
    Zhu, Lei
    Ma, Jingwei
    Peng, Chengmei
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 247
  • [28] Multi-level sequence denoising with cross-signal contrastive learning for sequential recommendation
    Zhu, Xiaofei
    Li, Liang
    Liu, Weidong
    Luo, Xin
    [J]. Neural Networks, 2024, 179
  • [29] Multi-level graph layout on the GPU
    Frishman, Yaniv
    Tal, Ayellet
    [J]. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2007, 13 (06) : 1310 - 1317
  • [30] MucLiPred: Multi-Level Contrastive Learning for Predicting Nucleic Acid Binding Residues of Proteins
    Zhang, Jiashuo
    Wang, Ruheng
    Wei, Leyi
    [J]. JOURNAL OF CHEMICAL INFORMATION AND MODELING, 2024, 64 (03) : 1050 - 1065