50 records in total
- [2] Accelerating Molecular Graph Neural Networks via Knowledge Distillation [J]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [3] Boosting Graph Neural Networks via Adaptive Knowledge Distillation [J]. Thirty-Seventh AAAI Conference on Artificial Intelligence (AAAI 2023), 2023, 37(6): 7793-7801.
- [6] Compressing deep graph convolution network with multi-staged knowledge distillation [J]. PLOS ONE, 2021, 16(8).
- [7] Graph-Free Knowledge Distillation for Graph Neural Networks [J]. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021: 2321-2327.
- [8] Online cross-layer knowledge distillation on graph neural networks with deep supervision [J]. Neural Computing and Applications, 2023, 35(30): 22359-22374.
- [9] RELIANT: Fair Knowledge Distillation for Graph Neural Networks [J]. Proceedings of the 2023 SIAM International Conference on Data Mining (SDM), 2023: 154+.