50 items in total
- [1] Improving Neural Topic Models using Knowledge Distillation [J]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1752-1771
- [3] Improving the Interpretability of Deep Neural Networks with Knowledge Distillation [J]. 2018 18th IEEE International Conference on Data Mining Workshops (ICDMW), 2018: 905-912
- [4] Towards Understanding and Improving Knowledge Distillation for Neural Machine Translation [J]. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics, ACL 2023, Vol 1, 2023: 8062-8079
- [6] Improving Robustness of Compressed Models with Weight Sharing through Knowledge Distillation [J]. 2024 IEEE 10th International Conference on Edge Computing and Scalable Cloud, EdgeCom 2024, 2024: 13-21
- [7] Diversity-Aware Coherence Loss for Improving Neural Topic Models [J]. 61st Conference of the Association for Computational Linguistics, ACL 2023, Vol 2, 2023: 1710-1722
- [8] Performance-Aware Mutual Knowledge Distillation for Improving Neural Architecture Search [J]. 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022: 11912-11922
- [9] Improving Route Choice Models by Incorporating Contextual Factors via Knowledge Distillation [J]. 2019 International Joint Conference on Neural Networks (IJCNN), 2019