50 records in total
- [1] MTKD: Multi-Teacher Knowledge Distillation for Image Super-Resolution. In Computer Vision – ECCV 2024, Part XXXIX, 2025, 15097: 364–382
- [2] Correlation Guided Multi-teacher Knowledge Distillation. In Neural Information Processing, ICONIP 2023, Part IV, 2024, 14450: 562–574
- [3] Reinforced Multi-Teacher Selection for Knowledge Distillation. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI 2021), 2021, 35: 14284–14291
- [5] A Multi-Teacher Assisted Knowledge Distillation Approach for Enhanced Face Image Authentication. In Proceedings of the 2023 ACM International Conference on Multimedia Retrieval (ICMR 2023), 2023: 135–143
- [6] Confidence-Aware Multi-Teacher Knowledge Distillation. In 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2022), 2022: 4498–4502
- [10] Decoupled Multi-teacher Knowledge Distillation Based on Entropy. In 2024 IEEE International Symposium on Circuits and Systems (ISCAS 2024), 2024