26 entries in total
- [11] Token-level and sequence-level loss smoothing for RNN language models. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL), Vol. 1, 2018: 2094–2103.
- [12] Self-Improvement of Non-autoregressive Model via Sequence-Level Distillation. 2023 Conference on Empirical Methods in Natural Language Processing (EMNLP 2023), 2023: 14202–14212.
- [15] Sequence-level models for distortion-rate behaviour of compressed video. 2005 International Conference on Image Processing (ICIP), Vols. 1–5, 2005: 1713–1716.
- [17] Knowledge Distillation for CTC-based Speech Recognition via Consistent Acoustic Representation Learning. Interspeech 2022, 2022: 2633–2637.
- [18] Domain Adaptation of DNN Acoustic Models Using Knowledge Distillation. 2017 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2017: 5185–5189.
- [20] Structure-Level Knowledge Distillation for Multilingual Sequence Labeling. 58th Annual Meeting of the Association for Computational Linguistics (ACL 2020), 2020: 3317–3330.