- [33] Improving Deep Mutual Learning via Knowledge Distillation [J]. Applied Sciences-Basel, 2022, 12(15).
- [34] Improving Neural Topic Models with Wasserstein Knowledge Distillation [C]. Advances in Information Retrieval, ECIR 2023, Pt II, 2023, 13981: 321-330.
- [35] Improving Neural Topic Models using Knowledge Distillation [C]. Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), 2020: 1752-1771.
- [36] Some Shades of Grey! - Interpretability and Explainability of Deep Neural Networks [C]. Proceedings of the ACM Workshop on Crossmodal Learning and Application (WCRML'19), 2019: 1.
- [37] Interpretability vs. Complexity: The Friction in Deep Neural Networks [C]. 2020 International Joint Conference on Neural Networks (IJCNN), 2020.
- [39] Multi-Teacher Knowledge Distillation for Compressed Video Action Recognition on Deep Neural Networks [C]. 2019 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2019: 2202-2206.