50 records in total
- [31] Ultrafast Video Attention Prediction with Coupled Knowledge Distillation [J]. Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI 2020), 2020, 34: 10802-10809
- [32] Knowledge Distillation for Energy Consumption Prediction in Additive Manufacturing [J]. IFAC-PapersOnLine, 2022, 55(2): 390-395
- [35] A rolling bearing fault diagnosis method based on multi-scale knowledge distillation and continual learning [J]. Journal of Vibration and Shock, 2024, 43(12): 276-285
- [36] Contrastive Supervised Distillation for Continual Representation Learning [J]. Image Analysis and Processing (ICIAP 2022), Part I, 2022, 13231: 597-609
- [38] Regularizing Brain Age Prediction via Gated Knowledge Distillation [J]. International Conference on Medical Imaging with Deep Learning (MIDL 2022), 2022, 172: 1430-1443
- [39] Multi-Task Learning with Knowledge Distillation for Dense Prediction [J]. IEEE/CVF International Conference on Computer Vision (ICCV 2023), 2023: 21493-21502
- [40] Multi-Task Knowledge Distillation for Eye Disease Prediction [J]. IEEE Winter Conference on Applications of Computer Vision (WACV 2021), 2021: 3982-3992