50 items total
- [3] Leveraging Contrastive Learning and Knowledge Distillation for Incomplete Modality Rumor Detection. Findings of the Association for Computational Linguistics (EMNLP 2023), 2023: 13492-13503
- [4] Improved knowledge distillation method with curriculum learning paradigm. Computer Integrated Manufacturing Systems (Jisuanji Jicheng Zhizao Xitong), 2022, 28(07): 2075-2082
- [5] Ensemble Knowledge Distillation for Learning Improved and Efficient Networks. ECAI 2020: 24th European Conference on Artificial Intelligence, 2020, 325: 953-960
- [6] Leveraging Logit Uncertainty for Better Knowledge Distillation. Scientific Reports, 2024, 14(01)
- [8] Knowledge Building in Organization from the Perspectives of Different Learning Styles. Proceedings of the Knowledge Management International Conference (KMICE) 2014, Vols 1 and 2, 2014: 749-753
- [9] Mixed Bandwidth Acoustic Modeling Leveraging Knowledge Distillation. 2019 IEEE Automatic Speech Recognition and Understanding Workshop (ASRU 2019), 2019: 509-515