共 50 条
- [2] Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks [J]. 2023 IEEE/CVF WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV), 2023, : 6424 - 6432
- [3] Reinforced Multi-Teacher Selection for Knowledge Distillation [J]. THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14284 - 14291
- [4] Correlation Guided Multi-teacher Knowledge Distillation [J]. NEURAL INFORMATION PROCESSING, ICONIP 2023, PT IV, 2024, 14450 : 562 - 574
- [6] CONFIDENCE-AWARE MULTI-TEACHER KNOWLEDGE DISTILLATION [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022, : 4498 - 4502
- [7] Adaptive multi-teacher multi-level knowledge distillation [J]. NEUROCOMPUTING, 2020, 415 : 106 - 113
- [8] Adaptive multi-teacher multi-level knowledge distillation [J]. Neurocomputing, 2021, 415 : 106 - 113
- [10] Decoupled Multi-teacher Knowledge Distillation based on Entropy [J]. 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,