50 records in total
- [1] Improving knowledge distillation via pseudo-multi-teacher network [J]. Machine Vision and Applications, 2023, 34
- [6] Improved Knowledge Distillation via Teacher Assistant [J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2020, 34 : 5191 - 5198
- [7] Reinforced Multi-Teacher Selection for Knowledge Distillation [J]. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35 : 14284 - 14291
- [8] Correlation Guided Multi-teacher Knowledge Distillation [J]. Neural Information Processing (ICONIP 2023), Part IV, 2024, 14450 : 562 - 574
- [9] Adaptive multi-teacher multi-level knowledge distillation [J]. Neurocomputing, 2020, 415 : 106 - 113