50 entries total
- [2] Student Customized Knowledge Distillation: Bridging the Gap Between Student and Teacher [J]. 2021 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2021), 2021: 5037-5046
- [3] Recruiting the Best Teacher Modality: A Customized Knowledge Distillation Method for IF Based Nephropathy Diagnosis [J]. MEDICAL IMAGE COMPUTING AND COMPUTER ASSISTED INTERVENTION, MICCAI 2023, PT V, 2023, 14224: 526-536
- [5] Improving knowledge distillation via pseudo-multi-teacher network [J]. Machine Vision and Applications, 2023, 34
- [7] Adapt Your Teacher: Improving Knowledge Distillation for Exemplar-free Continual Learning [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION WORKSHOPS, ICCVW, 2023: 3504-3509
- [8] Knowledge Distillation with the Reused Teacher Classifier [J]. 2022 IEEE/CVF CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2022: 11923-11932
- [9] Knowledge Distillation from A Stronger Teacher [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022
- [10] Improving Low-Resource Neural Machine Translation With Teacher-Free Knowledge Distillation [J]. IEEE ACCESS, 2020, 8: 206638-206645