共 50 条
- [3] Decoupled Multi-teacher Knowledge Distillation based on Entropy 2024 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, ISCAS 2024, 2024,
- [5] Multi-Grained Knowledge Distillation for Named Entity Recognition 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 5704 - 5716
- [7] Correlation Guided Multi-teacher Knowledge Distillation NEURAL INFORMATION PROCESSING, ICONIP 2023, PT IV, 2024, 14450 : 562 - 574
- [8] Reinforced Multi-Teacher Selection for Knowledge Distillation THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 14284 - 14291
- [9] Faster biomedical named entity recognition based on knowledge distillation Qinghua Daxue Xuebao/Journal of Tsinghua University, 2021, 61 (09): : 936 - 942
- [10] mKDNAD: A network flow anomaly detection method based on multi-teacher knowledge distillation 2022 16TH IEEE INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP2022), VOL 1, 2022, : 314 - 319