共 50 条
- [31] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation [J]. COMPUTER VISION - ECCV 2022, PT IV, 2022, 13664 : 585 - 602
- [32] Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023, : 3504 - 3509
- [33] Knowledge Distillation with the Reused Teacher Classifier [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022, : 11923 - 11932
- [34] Knowledge Distillation from A Stronger Teacher [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
- [37] MTKDSR: Multi-Teacher Knowledge Distillation for Super Resolution Image Reconstruction [J]. 2022 26TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2022, : 352 - 358
- [39] Improving Stance Detection with Multi-Dataset Learning and Knowledge Distillation [J]. 2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 6332 - 6345