50 items in total
- [42] Multi-Teacher Knowledge Distillation for Compressed Video Action Recognition on Deep Neural Networks [C]. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 2202-2206
- [43] Zero-Shot Knowledge Distillation in Deep Networks [C]. International Conference on Machine Learning (ICML), PMLR vol. 97, 2019
- [45] RELIANT: Fair Knowledge Distillation for Graph Neural Networks [C]. Proceedings of the 2023 SIAM International Conference on Data Mining (SDM), 2023: 154-+
- [47] PDD: Pruning Neural Networks During Knowledge Distillation [J]. Cognitive Computation, 2024: 3457-3467
- [49] Distilling Spikes: Knowledge Distillation in Spiking Neural Networks [C]. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 4536-4543