Boosting Graph Neural Networks via Adaptive Knowledge Distillation

Citations: 0
Authors
Guo, Zhichun [1 ]
Zhang, Chunhui [2 ]
Fan, Yujie [3 ]
Tian, Yijun [1 ]
Zhang, Chuxu [2 ]
Chawla, Nitesh V. [1 ]
Affiliations
[1] Univ Notre Dame, Notre Dame, IN 46556 USA
[2] Brandeis Univ, Waltham, MA 02453 USA
[3] Case Western Reserve Univ, Cleveland, OH 44106 USA
Funding
U.S. National Science Foundation
Keywords: none listed
DOI: not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
Graph neural networks (GNNs) have shown remarkable performance on diverse graph mining tasks. Although different GNNs share the same message-passing framework, our study shows that they learn distinct knowledge from the same graph. This suggests that performance can be improved by distilling the complementary knowledge of multiple models. However, knowledge distillation (KD) conventionally transfers knowledge from high-capacity teachers to a lightweight student, which deviates from our scenario: GNNs are often shallow. To transfer knowledge effectively, we need to tackle two challenges: how to transfer knowledge from compact teachers to a student of the same capacity, and how to exploit the student GNN's own learning ability. In this paper, we propose a novel adaptive KD framework, called BGNN, that sequentially transfers knowledge from multiple GNNs into a student GNN. We also introduce an adaptive temperature module and a weight boosting module, which guide the student toward the appropriate knowledge for effective learning. Extensive experiments demonstrate the effectiveness of BGNN; in particular, we achieve up to a 3.05% improvement on node classification and a 6.35% improvement on graph classification over vanilla GNNs.
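To make the described pipeline concrete, here is a minimal PyTorch sketch of sequential distillation with an adaptive temperature and boosting-style sample weights, in the spirit of the abstract. It is an illustration under our own assumptions, not the authors' implementation: every name (adaptive_temperature, boost_weights, distill_step, train_bgnn_like) and the exact parameterizations are hypothetical, and the teachers and student are treated as plain callables from inputs to class logits (a real GNN would also consume the graph structure, e.g. an edge_index in PyTorch Geometric).

import torch
import torch.nn.functional as F

def adaptive_temperature(teacher_logits, t_min=1.0, t_max=4.0):
    # Hypothetical adaptive-temperature rule: more confident teacher
    # predictions are softened more aggressively. The paper's exact
    # parameterization may differ.
    conf = teacher_logits.softmax(dim=-1).max(dim=-1).values        # (N,)
    return t_min + (t_max - t_min) * conf                           # (N,)

def boost_weights(student_logits, labels):
    # Boosting-style reweighting (our AdaBoost-flavored assumption):
    # samples the student currently misclassifies get double weight.
    with torch.no_grad():
        wrong = (student_logits.argmax(dim=-1) != labels).float()
        w = 1.0 + wrong
        return w / w.mean()

def distill_step(student, teacher, x, labels, optimizer, alpha=0.5):
    student.train()
    with torch.no_grad():
        t_logits = teacher(x)
    s_logits = student(x)
    T = adaptive_temperature(t_logits).unsqueeze(-1)                # (N, 1)
    # Per-sample-temperature KL between softened teacher and student outputs.
    kd = F.kl_div(
        F.log_softmax(s_logits / T, dim=-1),
        F.softmax(t_logits / T, dim=-1),
        reduction="none",
    ).sum(dim=-1) * T.squeeze(-1) ** 2
    ce = F.cross_entropy(s_logits, labels, reduction="none")
    w = boost_weights(s_logits, labels)
    loss = (w * (alpha * kd + (1 - alpha) * ce)).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def train_bgnn_like(student, teachers, x, labels, epochs_per_teacher=100, lr=1e-2):
    # Sequential transfer: distill one teacher at a time into the same
    # (same-capacity) student, as the abstract describes.
    optimizer = torch.optim.Adam(student.parameters(), lr=lr)
    for teacher in teachers:
        teacher.eval()
        for _ in range(epochs_per_teacher):
            distill_step(student, teacher, x, labels, optimizer)
    return student

The outer loop mirrors the abstract's "sequentially transfers knowledge from multiple GNNs": each teacher is distilled in turn into the same student, so later teachers refine rather than overwrite what earlier ones taught.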
Pages: 7793-7801 (9 pages)
Related Papers (showing 10 of 50)
  • [1] Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation
    Wei, Shaowei
    Wu, Zhengwei
    Zhang, Zhiqiang
    Zhou, Jun
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT III, 2023, 14171 : 87 - 103
  • [2] Accelerating Molecular Graph Neural Networks via Knowledge Distillation
    Kelvinius, Filip Ekström
    Georgiev, Dimitar
    Toshev, Artur Petrov
    Gasteiger, Johannes
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36, NEURIPS 2023, 2023,
  • [3] Double Wins: Boosting Accuracy and Efficiency of Graph Neural Networks by Reliable Knowledge Distillation
    Tan, Qiaoyu
    Zha, Daochen
    Liu, Ninghao
    Choi, Soo-Hyun
    Li, Li
    Chen, Rui
    Hu, Xia
    [J]. 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1343 - 1348
  • [4] Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation
    He, Huarui
    Wang, Jie
    Zhang, Zhanqiu
    Wu, Feng
    [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 534 - 544
  • [5] EGNN: Constructing explainable graph neural networks via knowledge distillation
    Li, Yuan
    Liu, Li
    Wang, Guoyin
    Du, Yong
    Chen, Penggang
    [J]. KNOWLEDGE-BASED SYSTEMS, 2022, 241
  • [6] On Representation Knowledge Distillation for Graph Neural Networks
    Joshi, Chaitanya K.
    Liu, Fayao
    Xun, Xu
    Lin, Jie
    Foo, Chuan Sheng
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (04) : 4656 - 4667
  • [7] Knowledge distillation via adaptive meta-learning for graph neural network
    Shen, Tiesunlong
    Wang, Jin
    Zhang, Xuejie
    [J]. INFORMATION SCIENCES, 2025, 689
  • [8] Graph-Free Knowledge Distillation for Graph Neural Networks
    Deng, Xiang
    Zhang, Zhongfei
    [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2321 - 2327
  • [9] RELIANT: Fair Knowledge Distillation for Graph Neural Networks
    Dong, Yushun
    Zhang, Binchi
    Yuan, Yiling
    Zou, Na
    Wang, Qi
    Li, Jundong
    [J]. PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 154 - +
  • [10] Online adversarial knowledge distillation for graph neural networks
    Wang, Can
    Wang, Zhe
    Chen, Defang
    Zhou, Sheng
    Feng, Yan
    Chen, Chun
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237