50 records in total
- [1] Online Knowledge Distillation for Multi-task Learning [C]. 2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023: 2358-2367
- [2] Multi-Task Knowledge Distillation for Eye Disease Prediction [C]. 2021 IEEE Winter Conference on Applications of Computer Vision (WACV 2021), 2021: 3982-3992
- [5] Knowledge Distillation and Multi-task Feature Learning for Partial Discharge Recognition [C]. 2023 IEEE 32nd Conference on Electrical Performance of Electronic Packaging and Systems (EPEPS), 2023
- [6] Multi-Task Distillation: Towards Mitigating the Negative Transfer in Multi-Task Learning [C]. 2021 IEEE International Conference on Image Processing (ICIP), 2021: 389-393
- [7] Cross-Task Knowledge Distillation in Multi-Task Recommendation [C]. Thirty-Sixth AAAI Conference on Artificial Intelligence / Thirty-Fourth Conference on Innovative Applications of Artificial Intelligence / The Twelfth Symposium on Educational Advances in Artificial Intelligence, 2022: 4318-4326
- [9] Speech Emotion: Investigating Model Representations, Multi-Task Learning and Knowledge Distillation [C]. INTERSPEECH 2022, 2022: 4715-4719
- [10] Making Punctuation Restoration Robust and Fast with Multi-Task Learning and Knowledge Distillation [C]. 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2021), 2021: 7773-7777