50 items in total
- [1] Correlation Guided Multi-teacher Knowledge Distillation [J]. Neural Information Processing, ICONIP 2023, Pt IV, 2024, 14450: 562-574
- [2] Reinforced Multi-Teacher Selection for Knowledge Distillation [J]. Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI 2021), 2021, 35: 14284-14291
- [4] Confidence-Aware Multi-Teacher Knowledge Distillation [J]. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 4498-4502
- [6] Adaptive Multi-teacher Multi-level Knowledge Distillation [J]. Neurocomputing, 2020, 415: 106-113
- [7] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature [J]. Web and Big Data, Pt II, APWeb-WAIM 2023, 2024, 14332: 103-116
- [8] Multi-teacher Knowledge Distillation for Compressed Video Action Recognition on Deep Neural Networks [J]. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 2202-2206
- [10] Decoupled Multi-teacher Knowledge Distillation Based on Entropy [J]. 2024 IEEE International Symposium on Circuits and Systems (ISCAS 2024), 2024