Graph Barlow Twins: A self-supervised representation learning framework for graphs

Cited by: 29
Authors
Bielak, Piotr [1 ]
Kajdanowicz, Tomasz [1 ]
Chawla, Nitesh V. [2 ]
Affiliations
[1] Wroclaw Univ Sci & Technol, Dept Artificial Intelligence, Wroclaw, Poland
[2] Univ Notre Dame, Dept Comp Sci & Engn, Notre Dame, IN USA
Keywords
Representation learning; Self-supervised learning; Graph embedding;
DOI
10.1016/j.knosys.2022.109631
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The self-supervised learning (SSL) paradigm is an essential exploration area, which aims to eliminate the need for expensive data labeling. Despite the great success of SSL methods in computer vision and natural language processing, most of them employ contrastive learning objectives that require negative samples, which are hard to define. This becomes even more challenging in the case of graphs and is a bottleneck for achieving robust representations. To overcome such limitations, we propose a framework for self-supervised graph representation learning - Graph Barlow Twins, which utilizes a cross-correlation-based loss function instead of negative samples. Moreover, it does not rely on non-symmetric neural network architectures - in contrast to the state-of-the-art self-supervised graph representation learning method BGRL. We show that our method achieves results competitive with the best self-supervised and fully supervised methods, while requiring fewer hyperparameters and substantially shorter computation time (ca. 30 times faster than BGRL). (c) The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/).
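The cross-correlation-based objective mentioned in the abstract follows the Barlow Twins idea: push the empirical cross-correlation matrix between two embedded views toward the identity, so no negative samples are needed. Below is a minimal NumPy sketch of such a loss; the function name, the standardization details, and the `lambda_offdiag` weight are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def barlow_twins_loss(z_a, z_b, lambda_offdiag=5e-3):
    """Cross-correlation loss between two (N, D) embedding views.

    z_a, z_b: node embeddings produced from two augmentations of the
    same graph. Hedged sketch: exact normalization and weighting in
    Graph Barlow Twins may differ.
    """
    n, _ = z_a.shape
    # Standardize each embedding dimension over the batch of nodes.
    z_a = (z_a - z_a.mean(0)) / (z_a.std(0) + 1e-8)
    z_b = (z_b - z_b.mean(0)) / (z_b.std(0) + 1e-8)
    # Empirical D x D cross-correlation matrix of the two views.
    c = z_a.T @ z_b / n
    # Diagonal entries should approach 1 (views agree per dimension).
    on_diag = ((1.0 - np.diag(c)) ** 2).sum()
    # Off-diagonal entries should approach 0 (decorrelated dimensions).
    off_diag = (c ** 2).sum() - (np.diag(c) ** 2).sum()
    return on_diag + lambda_offdiag * off_diag
```

Because the objective is symmetric in the two views, no momentum encoder or predictor head (as in BGRL) is required, which is the source of the reduced hyperparameter count noted in the abstract.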
Pages: 12