Enhanced Scalable Graph Neural Network via Knowledge Distillation

Cited by: 0
|
Authors
Mai, Chengyuan [1 ,2 ]
Chang, Yaomin [1 ,2 ]
Chen, Chuan [1 ,2 ]
Zheng, Zibin [2 ,3 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Sun Yat Sen Univ, Natl Engn Res Ctr Digital Life, Guangzhou 510006, Peoples R China
[3] Sun Yat Sen Univ, Sch Software Engn, Zhuhai 519000, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Graph neural networks; Scalability; Computational modeling; Convolution; Training; Spectral analysis; Data mining; Graph neural network (GNN); knowledge distillation (KD); network embedding; scalability;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence];
Subject Classification Code
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Graph neural networks (GNNs) have achieved state-of-the-art performance in various graph representation learning scenarios. However, when applied to real-world graph data, GNNs encounter scalability issues. Existing GNNs often incur a high computational load in both the training and inference stages, which prevents them from meeting the performance requirements of large-scale scenarios with massive numbers of nodes. Although several studies on scalable GNNs have been conducted, they either improve scalability only to a limited extent or do so at the expense of effectiveness. Inspired by the success of knowledge distillation (KD) in preserving performance while improving scalability in computer vision and natural language processing, we propose an enhanced scalable GNN via KD (KD-SGNN) to improve both the scalability and the effectiveness of GNNs. On the one hand, KD-SGNN adopts the idea of decoupled GNNs, which separates feature transformation from feature propagation and leverages preprocessing techniques to improve scalability. On the other hand, KD-SGNN introduces two KD mechanisms, soft-target (ST) distillation and shallow imitation (SI) distillation, to improve expressiveness. The scalability and effectiveness of KD-SGNN are evaluated on multiple real-world datasets, and the effectiveness of the proposed KD mechanisms is further verified through comprehensive analyses.
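To make the two ideas in the abstract concrete, the sketch below illustrates (1) decoupled feature propagation precomputed as a preprocessing step, so that training only touches node-wise features, and (2) a soft-target distillation loss that blends hard-label cross-entropy with the KL divergence to a teacher's softened predictions. This is a minimal PyTorch sketch under assumed conventions; the function names, the temperature/weight values, and the placeholder teacher are illustrative assumptions, not the authors' implementation of KD-SGNN (which additionally includes shallow imitation distillation).

```python
# Minimal, illustrative sketch (assumptions, not the authors' code):
# (1) decoupled propagation done once as preprocessing, (2) soft-target KD loss.
import torch
import torch.nn.functional as F


def precompute_propagated_features(adj_norm: torch.Tensor,
                                    features: torch.Tensor,
                                    num_hops: int) -> torch.Tensor:
    """Decoupled propagation: apply the normalized adjacency num_hops times.

    Run once offline, so the training loop involves no graph operations.
    """
    x = features
    for _ in range(num_hops):
        x = adj_norm @ x  # sparse-dense or dense-dense matmul
    return x


def soft_target_loss(student_logits: torch.Tensor,
                     teacher_logits: torch.Tensor,
                     labels: torch.Tensor,
                     temperature: float = 2.0,
                     alpha: float = 0.5) -> torch.Tensor:
    """Blend hard-label cross-entropy with KL to the teacher's soft targets."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    return alpha * ce + (1.0 - alpha) * kd


# Toy usage: 100 nodes, 16-dim features, 3 classes, identity as a stand-in
# for the normalized adjacency, random logits as a stand-in for a GNN teacher.
adj = torch.eye(100)
feats = torch.randn(100, 16)
labels = torch.randint(0, 3, (100,))
student = torch.nn.Linear(16, 3)      # node-wise student, no message passing
teacher_logits = torch.randn(100, 3)

x = precompute_propagated_features(adj, feats, num_hops=2)
loss = soft_target_loss(student(x), teacher_logits, labels)
loss.backward()
```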
Pages: 1258-1271
Page count: 14
Related Papers
50 records in total
  • [21] Online adversarial knowledge distillation for graph neural networks
    Wang, Can
    Wang, Zhe
    Chen, Defang
    Zhou, Sheng
    Feng, Yan
    Chen, Chun
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 237
  • [22] Adaptively Denoising Graph Neural Networks for Knowledge Distillation
    Guo, Yuxin
    Yang, Cheng
    Shi, Chuan
    Tu, Ke
    Wu, Zhengwei
    Zhang, Zhiqiang
    Zhou, Jun
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES-RESEARCH TRACK AND DEMO TRACK, PT VIII, ECML PKDD 2024, 2024, 14948 : 253 - 269
  • [23] Graph Convolutional Neural Network for Intelligent Fault Diagnosis of Machines via Knowledge Graph
    Mao, Zehui
    Wang, Huan
    Jiang, Bin
    Xu, Juan
    Guo, Huifeng
    IEEE TRANSACTIONS ON INDUSTRIAL INFORMATICS, 2024, 20 (05) : 7862 - 7870
  • [25] Metro Traffic Flow Prediction via Knowledge Graph and Spatiotemporal Graph Neural Network
    Wang, Shun
    Lv, Yimei
    Peng, Yuan
    Piao, Xinglin
    Zhang, Yong
    JOURNAL OF ADVANCED TRANSPORTATION, 2022, 2022
  • [26] Cross-lingual Knowledge Graph Alignment via Graph Matching Neural Network
    Xu, Kun
    Wang, Liwei
    Yu, Mo
    Feng, Yansong
    Song, Yan
    Wang, Zhiguo
    Yu, Dong
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3156 - 3161
  • [27] SGKD: A Scalable and Effective Knowledge Distillation Framework for Graph Representation Learning
    He, Yufei
    Ma, Yao
    2022 IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS, ICDMW, 2022, : 666 - 673
  • [28] A graph neural network-enhanced knowledge graph framework for intelligent analysis of policing cases
    Zhu, Hongqiang
    MATHEMATICAL BIOSCIENCES AND ENGINEERING, 2023, 20 (07) : 11585 - 11604
  • [29] Multi-view knowledge graph fusion via knowledge-aware attentional graph neural network
    Huang, Zhichao
    Li, Xutao
    Ye, Yunming
    Zhang, Baoquan
    Xu, Guangning
    Gan, Wensheng
    APPLIED INTELLIGENCE, 2023, 53 (04) : 3652 - 3671
  • [30] SMIGNN: social recommendation with multi-intention knowledge distillation based on graph neural network
    Niu, Yong
    Xing, Xing
    Jia, Zhichun
    Xin, Mindong
    Xing, Junye
    THE JOURNAL OF SUPERCOMPUTING, 2024, 80 : 6965 - 6988