50 items in total
- [21] A Multi-Teacher Assisted Knowledge Distillation Approach for Enhanced Face Image Authentication. Proceedings of the 2023 ACM International Conference on Multimedia Retrieval (ICMR 2023), 2023: 135-143
- [22] MTMS: Multi-teacher Multi-stage Knowledge Distillation for Reasoning-Based Machine Reading Comprehension. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024), 2024: 1995-2005
- [23] Dissolved oxygen prediction in the Taiwan Strait with the attention-based multi-teacher knowledge distillation model. Ocean and Coastal Management, 2025, 265
- [24] Semi-supervised lung adenocarcinoma histopathology image classification based on multi-teacher knowledge distillation. Physics in Medicine and Biology, 2024, 69(18)
- [26] Multi-Teacher Knowledge Distillation for Compressed Video Action Recognition on Deep Neural Networks. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 2202-2206
- [27] A Multi-teacher Knowledge Distillation Framework for Distantly Supervised Relation Extraction with Flexible Temperature. Web and Big Data (APWeb-WAIM 2023), Pt. II, 2024, 14332: 103-116
- [28] Enhanced Accuracy and Robustness via Multi-teacher Adversarial Distillation. Computer Vision - ECCV 2022, Pt. IV, 2022, 13664: 585-602