Total: 50 records
- [21] Confidence-aware multi-teacher knowledge distillation. 2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 4498-4502
- [22] Adaptive multi-teacher multi-level knowledge distillation. Neurocomputing, 2021, 415: 106-113
- [25] Robust semantic segmentation with multi-teacher knowledge distillation. IEEE Access, 2021, 9: 119049-119066
- [26] UKSSL: Underlying knowledge based semi-supervised learning for medical image classification. IEEE Open Journal of Engineering in Medicine and Biology, 2024, 5: 459-466
- [27] Knowledge distillation for semi-supervised domain adaptation. OR 2.0 Context-Aware Operating Theaters and Machine Learning in Clinical Neuroimaging, 2019, 11796: 68-76
- [28] UNIC: Universal classification models via multi-teacher distillation. Computer Vision - ECCV 2024, Pt IV, 2025, 15062: 353-371