Online adversarial knowledge distillation for graph neural networks

Cited by: 1
Authors
Wang, Can [1 ]
Wang, Zhe [1 ]
Chen, Defang [1 ]
Zhou, Sheng [1 ]
Feng, Yan [1 ]
Chen, Chun [1 ]
Affiliations
[1] Zhejiang Univ, Shanghai Inst Adv Study, Coll Comp Sci, ZJU Bangsun Joint Res Ctr, Hangzhou 310013, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Knowledge distillation; Graph neural networks; Dynamic graph; Online distillation; Framework
DOI
10.1016/j.eswa.2023.121671
CLC Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Knowledge distillation, a technique that has recently gained popularity for improving model generalization in Convolutional Neural Networks (CNNs), assumes that both the teacher and the student models are trained on identical data distributions. However, its effect on Graph Neural Networks (GNNs) is less than satisfactory, since graph topology and node attributes tend to evolve over time, leading to distribution shift. In this paper, we tackle this challenge by simultaneously training a group of graph neural networks in an online distillation fashion, where the group knowledge acts as a dynamic virtual teacher and structural changes in the graph are effectively captured. To improve distillation performance, two types of knowledge are transferred among the students so that they enhance one another: local knowledge, reflecting information in the graph topology and node attributes, and global knowledge, reflecting the predictions over classes. We transfer the global knowledge with KL-divergence, as in vanilla knowledge distillation, while exploiting the complex structure of the local knowledge with an efficient adversarial cyclic learning framework. Extensive experiments verify the effectiveness of our proposed online adversarial distillation approach. The code is published at https://github.com/wangz3066/OnlineDistillGCN.
Pages: 12
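
As a rough illustration of the global-knowledge transfer described in the abstract, the following is a minimal sketch, not the authors' released implementation (see the linked repository for that). It assumes a group of at least two peer student networks that each produce class logits for the same batch of nodes; averaging the peers' softened predictions as the virtual teacher, the temperature T, and the weight alpha are illustrative assumptions, and the adversarial cyclic transfer of local knowledge is omitted.

```python
# Minimal sketch of online distillation of "global knowledge" among a group
# of student networks. Assumes >= 2 students, each emitting class logits for
# the same batch of nodes. T (temperature) and alpha (loss weight) are
# illustrative hyperparameters, not values from the paper.
import torch
import torch.nn.functional as F

def online_kd_losses(logits_list, labels, T=2.0, alpha=1.0):
    """One loss per student: cross-entropy + KL to the peer-averaged teacher."""
    losses = []
    for i, logits in enumerate(logits_list):
        # Group knowledge acts as a dynamic virtual teacher: average the
        # peers' softened class distributions, detached so no gradients flow
        # back into the peers through the distillation term.
        peer_probs = torch.stack([
            F.softmax(l.detach() / T, dim=-1)
            for j, l in enumerate(logits_list) if j != i
        ]).mean(dim=0)
        ce = F.cross_entropy(logits, labels)
        # KL(teacher || student) on temperature-softened predictions, scaled
        # by T^2 to keep gradient magnitudes comparable across temperatures.
        kd = F.kl_div(F.log_softmax(logits / T, dim=-1),
                      peer_probs, reduction="batchmean") * (T * T)
        losses.append(ce + alpha * kd)
    return losses
```

In an actual training loop one would forward every student GNN on the same graph (e.g., `logits_list = [net(features, edge_index) for net in students]`, where `net`, `features`, and `edge_index` are hypothetical names) and back-propagate each returned loss into its own student.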