50 items in total
- [1] Temperature Annealing Knowledge Distillation from Averaged Teacher [J]. 2022 IEEE 42nd International Conference on Distributed Computing Systems Workshops (ICDCSW), 2022: 133-138
- [2] Knowledge Distillation with the Reused Teacher Classifier [J]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 11923-11932
- [4] Knowledge Distillation: A Good Teacher Is Patient and Consistent [J]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 10915-10924
- [5] Knowledge Distillation with a Precise Teacher and Prediction with Abstention [J]. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 9000-9006
- [6] A Two-Teacher Framework for Knowledge Distillation [J]. Advances in Neural Networks - ISNN 2019, Pt I, 2019, 11554: 58-66
- [7] Improved Knowledge Distillation via Teacher Assistant [J]. Thirty-Fourth AAAI Conference on Artificial Intelligence, the Thirty-Second Innovative Applications of Artificial Intelligence Conference and the Tenth AAAI Symposium on Educational Advances in Artificial Intelligence, 2020, 34: 5191-5198
- [10] Reinforced Multi-Teacher Selection for Knowledge Distillation [J]. Thirty-Fifth AAAI Conference on Artificial Intelligence, Thirty-Third Conference on Innovative Applications of Artificial Intelligence and the Eleventh Symposium on Educational Advances in Artificial Intelligence, 2021, 35: 14284-14291