Total: 50 entries
- [1] Boosting Adaptive Graph Augmented MLPs via Customized Knowledge Distillation [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: RESEARCH TRACK, ECML PKDD 2023, PT III, 2023, 14171 : 87 - 103
- [2] Accelerating Molecular Graph Neural Networks via Knowledge Distillation [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36, NEURIPS 2023, 2023,
- [3] Double Wins: Boosting Accuracy and Efficiency of Graph Neural Networks by Reliable Knowledge Distillation [J]. 23RD IEEE INTERNATIONAL CONFERENCE ON DATA MINING, ICDM 2023, 2023, : 1343 - 1348
- [4] Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation [J]. PROCEEDINGS OF THE 28TH ACM SIGKDD CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, KDD 2022, 2022, : 534 - 544
- [8] Graph-Free Knowledge Distillation for Graph Neural Networks [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2321 - 2327
- [9] RELIANT: Fair Knowledge Distillation for Graph Neural Networks [J]. PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 154 - +