50 items in total
- [1] Adaptive Multi-Teacher Knowledge Distillation with Meta-Learning [C]. 2023 IEEE International Conference on Multimedia and Expo (ICME), 2023: 1943-1948
- [4] A multi-teacher learning automata computing model for graph partitioning problems [J]. Electrical Engineering in Japan (English translation of Denki Gakkai Ronbunshi), 2004, 148(1): 46-53
- [6] Learning Lightweight Object Detectors via Multi-Teacher Progressive Distillation [C]. International Conference on Machine Learning (ICML), Vol. 202, 2023
- [8] Adaptive multi-teacher multi-level knowledge distillation [J]. Neurocomputing, 2020, 415: 106-113
- [9] Reinforced Multi-Teacher Selection for Knowledge Distillation [C]. Proceedings of the AAAI Conference on Artificial Intelligence, 2021, 35: 14284-14291