- [41] A Two-Teacher Framework for Knowledge Distillation. Advances in Neural Networks - ISNN 2019, Part I, 2019, 11554: 58-66
- [43] Revisiting Knowledge Distillation: An Inheritance and Exploration Framework. 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR 2021), 2021: 3578-3587
- [45] Image Classification Framework Based on Knowledge Distillation. Journal of Jilin University (Engineering and Technology Edition), 2024, 54(08): 2307-2312
- [46] A Unified Framework for Real Time Motion Completion. Thirty-Sixth AAAI Conference on Artificial Intelligence (AAAI-22), 2022: 4459-4467
- [48] Knowledge Distillation with a Precise Teacher and Prediction with Abstention. 2020 25th International Conference on Pattern Recognition (ICPR), 2021: 9000-9006
- [49] Ensembled CTR Prediction via Knowledge Distillation. CIKM '20: Proceedings of the 29th ACM International Conference on Information & Knowledge Management, 2020: 2941-2948