Total: 50 entries
- [1] Compressing Deep Graph Neural Networks via Adversarial Knowledge Distillation [C]. Proceedings of the 28th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD 2022), 2022: 534-544.
- [3] Graph-Free Knowledge Distillation for Graph Neural Networks [C]. Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence (IJCAI 2021), 2021: 2321-2327.
- [4] Online cross-layer knowledge distillation on graph neural networks with deep supervision [J]. Neural Computing and Applications, 2023, 35(30): 22359-22374.
- [6] RELIANT: Fair Knowledge Distillation for Graph Neural Networks [C]. Proceedings of the 2023 SIAM International Conference on Data Mining (SDM 2023), 2023: 154+.
- [7] Knowledge Distillation Improves Graph Structure Augmentation for Graph Neural Networks [C]. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.
- [8] Accelerating Molecular Graph Neural Networks via Knowledge Distillation [C]. Advances in Neural Information Processing Systems 36 (NeurIPS 2023), 2023.
- [9] Knowledge Distillation with Graph Neural Networks for Epileptic Seizure Detection [C]. Machine Learning and Knowledge Discovery in Databases: Applied Data Science and Demo Track (ECML PKDD 2023), Part VI, 2023, 14174: 547-563.
- [10] Geometric Knowledge Distillation: Topology Compression for Graph Neural Networks [C]. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.