- [3] Leveraging logit uncertainty for better knowledge distillation. Scientific Reports, 2024, 14(1).
- [4] Mixed Bandwidth Acoustic Modeling Leveraging Knowledge Distillation. 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU 2019), 2019: 509-515.
- [5] Improved Knowledge Distillation via Teacher Assistant. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 5191-5198.
- [6] Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 13492-13503.
- [7] Leveraging Speech Production Knowledge for Improved Speech Recognition. 2009 IEEE Workshop on Automatic Speech Recognition & Understanding (ASRU 2009), 2009: 58-63.
- [9] Multimodal fusion and knowledge distillation for improved anomaly detection. Visual Computer, 2024.
- [10] Improved Knowledge Distillation for Crowd Counting on IoT Devices. 2023 IEEE International Conference on Edge Computing and Communications (EDGE), 2023: 207-214.