50 entries in total
- [22] Knowledge Distillation for Wireless Edge Learning. 2021 IEEE Statistical Signal Processing Workshop (SSP), 2021: 600-604
- [23] Noise as a Resource for Learning in Knowledge Distillation. 2021 IEEE Winter Conference on Applications of Computer Vision (WACV), 2021: 3128-3137
- [24] Improved Knowledge Distillation via Teacher Assistant. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI), 2020, 34: 5191-5198
- [26] Learning Interpretation with Explainable Knowledge Distillation. 2021 IEEE International Conference on Big Data (Big Data), 2021: 705-714
- [27] A Survey of Knowledge Distillation in Deep Learning. Chinese Journal of Computers, 2022, 45(08): 1638-1673
- [30] Continual Learning Based on Knowledge Distillation and Representation Learning. Artificial Neural Networks and Machine Learning - ICANN 2022, Pt IV, 2022, 13532: 27-38