- [43] Guiding CTC Posterior Spike Timings for Improved Posterior Fusion and Knowledge Distillation. INTERSPEECH 2019, 2019: 1616-1620
- [44] Discover the Effective Strategy for Face Recognition Model Compression by Improved Knowledge Distillation. 2018 25th IEEE International Conference on Image Processing (ICIP), 2018: 2416-2420
- [45] Improved Knowledge Distillation for Training Fast Low Resolution Face Recognition Model. 2019 IEEE/CVF International Conference on Computer Vision Workshops (ICCVW), 2019: 2655-2661
- [46] Knowledge Augmentation for Distillation: A General and Effective Approach to Enhance Knowledge Distillation. Proceedings of the 1st International Workshop on Efficient Multimedia Computing under Limited Resources (EMCLR 2024), 2024: 23-31
- [49] Explaining Knowledge Distillation by Quantifying the Knowledge. 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2020), 2020: 12922-12932
- [50] Weighted Knowledge Based Knowledge Distillation. Transactions of the Korean Institute of Electrical Engineers, 2022, 71(02): 431-435