50 records in total
- [25] Improving Deep Mutual Learning via Knowledge Distillation. APPLIED SCIENCES-BASEL, 2022, 12(15).
- [26] Improving Neural Topic Models with Wasserstein Knowledge Distillation. ADVANCES IN INFORMATION RETRIEVAL, ECIR 2023, PT II, 2023, 13981: 321-330.
- [27] Improving solar distillation with a parabolic trough solar concentrator. ABSTRACTS OF PAPERS OF THE AMERICAN CHEMICAL SOCIETY, 2012, 243.
- [28] Improving distillation tray efficiencies, the easy way. 1996 ICHEME RESEARCH EVENT - SECOND EUROPEAN CONFERENCE FOR YOUNG RESEARCHERS IN CHEMICAL ENGINEERING, VOLS 1 AND 2, 1996: 928-930.
- [30] Improving the Interpretability of Deep Neural Networks with Knowledge Distillation. 2018 18TH IEEE INTERNATIONAL CONFERENCE ON DATA MINING WORKSHOPS (ICDMW), 2018: 905-912.