50 entries in total
- [22] Knowledge Fusion Distillation: Improving Distillation with Multi-scale Attention Mechanisms [J]. Neural Processing Letters, 2023, 55: 6165-6180.
- [24] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning [J]. 2023 IEEE International Conference on Multimedia and Expo (ICME), 2023: 1943-1948.
- [28] MTUW-GAN: A Multi-Teacher Knowledge Distillation Generative Adversarial Network for Underwater Image Enhancement [J]. Applied Sciences-Basel, 2024, 14 (2).
- [29] Data-Free Low-Bit Quantization via Dynamic Multi-teacher Knowledge Distillation [J]. Pattern Recognition and Computer Vision (PRCV 2023), Pt VIII, 2024, 14432: 28-41.
- [30] MulDE: Multi-teacher Knowledge Distillation for Low-dimensional Knowledge Graph Embeddings [J]. Proceedings of the World Wide Web Conference 2021 (WWW 2021), 2021: 1716-1726.