50 entries in total
- [1] Online Knowledge Distillation for Multi-task Learning. WACV 2023, pp. 2358–2367.
- [2] Multi-Task Knowledge Distillation for Eye Disease Prediction. WACV 2021, pp. 3982–3992.
- [4] Application of Knowledge Distillation to Multi-Task Speech Representation Learning. INTERSPEECH 2023, pp. 2813–2817.
- [6] Contrastive Multi-Task Dense Prediction. AAAI 2023, pp. 3190–3197.
- [7] Knowledge Distillation and Multi-task Feature Learning for Partial Discharge Recognition. IEEE EPEPS 2023.
- [8] Multi-Task Distillation: Towards Mitigating the Negative Transfer in Multi-Task Learning. ICIP 2021, pp. 389–393.
- [9] DeMT: Deformable Mixer Transformer for Multi-Task Learning of Dense Prediction. AAAI 2023, pp. 3072–3080.
- [10] Cross-Task Knowledge Distillation in Multi-Task Recommendation. AAAI 2022, pp. 4318–4326.