Distill2Vec: Dynamic Graph Representation Learning with Knowledge Distillation

Cited by: 1
Authors
Antaris, Stefanos [1]
Rafailidis, Dimitrios [2]
Affiliations
[1] KTH Royal Inst Technol, Hive Streaming AB, Stockholm, Sweden
[2] Maastricht Univ, Maastricht, Netherlands
Source
2020 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING (ASONAM), 2020
Keywords
Dynamic graph representation learning; knowledge distillation; model compression
DOI
10.1109/ASONAM49781.2020.9381315
CLC classification
TP18 [Artificial Intelligence Theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
Dynamic graph representation learning strategies rely on different neural architectures to capture the graph evolution over time. However, these architectures require a large number of parameters to train and suffer from high online inference latency, that is, many model parameters must be updated when new data arrive online. In this study we propose Distill2Vec, a knowledge distillation strategy to train a compact model with a small number of trainable parameters, so as to reduce the latency of online inference while maintaining high model accuracy. We design a distillation loss function based on the Kullback-Leibler divergence to transfer the knowledge acquired by a teacher model trained on offline data to a small-size student model for online data. Our experiments on publicly available datasets show the superiority of our proposed model over several state-of-the-art approaches, with relative gains of up to 5% in the link prediction task. In addition, we demonstrate the effectiveness of our knowledge distillation strategy in terms of the number of required parameters, where Distill2Vec achieves a compression ratio of up to 7:100 compared with baseline approaches. For reproducibility, our implementation is publicly available at https://stefanosantaris.github.io/Distill2Vec.
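To illustrate the kind of loss the abstract describes, below is a minimal PyTorch sketch of a Kullback-Leibler distillation term between teacher and student outputs. This is a hedged sketch under assumptions: the function name distillation_loss, the temperature parameter, and the use of temperature-softened softmax distributions are illustrative conventions from the distillation literature, not details taken from the Distill2Vec paper or its released code.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      temperature: float = 2.0) -> torch.Tensor:
    # Hypothetical KL-based distillation term (not the authors' code).
    # Soften both output distributions with a temperature, then measure
    # how far the student's distribution is from the (fixed) teacher's.
    student_log_probs = F.log_softmax(student_logits / temperature, dim=-1)
    teacher_probs = F.softmax(teacher_logits.detach() / temperature, dim=-1)
    # detach() keeps gradients from flowing into the teacher; "batchmean"
    # matches the mathematical definition of KL divergence; the T^2 factor
    # keeps gradient magnitudes comparable across temperature settings.
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction="batchmean") * temperature ** 2

In a teacher-student setup, a term like this would typically be added to the student's task loss (e.g., a link prediction loss) while the teacher's parameters stay frozen.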
Pages: 60-64
Page count: 5
Related papers
50 results in total
  • [1] Representation Learning for Knowledge graph with Dynamic Margin
    Luo, Yiqin
    Chang, Liang
    Rao, Guanjun
    Chen, Wei
    Gu, Tianlong
    2018 11TH INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN (ISCID), VOL 2, 2018, : 305 - 308
  • [2] dyngraph2vec: Capturing network dynamics using dynamic graph representation learning
    Goyal, Palash
    Chhetri, Sujit Rokka
    Canedo, Arquimedes
    KNOWLEDGE-BASED SYSTEMS, 2020, 187
  • [3] Cascade2vec: Learning Dynamic Cascade Representation by Recurrent Graph Neural Networks
    Huang, Zhenhua
    Wang, Zhenyu
    Zhang, Rui
    IEEE ACCESS, 2019, 7 : 144800 - 144812
  • [4] SGKD: A Scalable and Effective Knowledge Distillation Framework for Graph Representation Learning
    He, Yufei
    Ma, Yao
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022, : 666 - 673
  • [5] Distill on the Go: Online knowledge distillation in self-supervised learning
    Bhat, Prashant
    Arani, Elahe
    Zonooz, Bahram
    2021 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION WORKSHOPS, CVPRW 2021, 2021, : 2672 - 2681
  • [6] On Representation Knowledge Distillation for Graph Neural Networks
    Joshi, Chaitanya K.
    Liu, Fayao
    Xun, Xu
    Lin, Jie
    Foo, Chuan Sheng
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4656 - 4667
  • [7] Cross-Modal Graph Knowledge Representation and Distillation Learning for Land Cover Classification
    Wang, Wenzhen
    Liu, Fang
    Liao, Wenzhi
    Xiao, Liang
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2023, 61
  • [8] Distill-DBDGAN: Knowledge Distillation and Adversarial Learning Framework for Defocus Blur Detection
    Jonna, Sankaraganesh
    Medhi, Moushumi
    Sahay, Rajiv Ranjan
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2023, 19 (02)
  • [9] Things2Vec: Semantic Modeling in the Internet of Things With Graph Representation Learning
    Hu, Liang
    Wu, Gang
    Xing, Yongheng
    Wang, Feng
    IEEE INTERNET OF THINGS JOURNAL, 2020, 7 (03) : 1939 - 1948
  • [10] Temporal Knowledge Graph Reasoning Based on Dynamic Fusion Representation Learning
    Chen, Hongwei
    Zhang, Man
    Chen, Zexi
    EXPERT SYSTEMS, 2025, 42 (02)