18 records in total
- [1] Leveraging logit uncertainty for better knowledge distillation. Scientific Reports, 2024, 14(1).
- [2] An investigation of a knowledge distillation method for CTC acoustic models. 2018 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2018: 5809-5813.
- [5] Investigation of sequence-level knowledge distillation methods for CTC acoustic models. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 6156-6160.
- [7] Efficient knowledge distillation: empowering small language models with teacher model insights. Natural Language Processing and Information Systems, Pt I (NLDB 2024), 2024, 14762: 32-46.
- [8] Online target sound extraction with knowledge distillation from partially non-causal teacher. 2024 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2024: 561-565.
- [9] Domain-specific knowledge distillation yields smaller and better models for conversational commerce. Proceedings of the 5th Workshop on e-Commerce and NLP (ECNLP 5), 2022: 151-160.
- [10] Self-improving teacher cultivates better student: distillation calibration for multimodal large language models. Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR 2024), 2024: 882-892.