50 entries in total
- [2] Graph-Free Knowledge Distillation for Graph Neural Networks [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 2321 - 2327
- [3] Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
- [4] RELIANT: Fair Knowledge Distillation for Graph Neural Networks [J]. PROCEEDINGS OF THE 2023 SIAM INTERNATIONAL CONFERENCE ON DATA MINING, SDM, 2023, : 154+
- [6] A New Concept for Explaining Graph Neural Networks [J]. NESY 2021: NEURAL-SYMBOLIC LEARNING AND REASONING, 2021, 2986 : 1 - 5
- [7] Accelerating Molecular Graph Neural Networks via Knowledge Distillation [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36, NEURIPS 2023, 2023
- [8] Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
- [9] Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES: APPLIED DATA SCIENCE AND DEMO TRACK, ECML PKDD 2023, PT VI, 2023, 14174 : 547 - 563
- [10] Boosting Graph Neural Networks via Adaptive Knowledge Distillation [J]. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 6, 2023, : 7793 - 7801