共 50 条
- [31] Ensemble Compressed Language Model Based on Knowledge Distillation and Multi-Task Learning 2022 7TH INTERNATIONAL CONFERENCE ON BUSINESS AND INDUSTRIAL RESEARCH (ICBIR2022), 2022, : 72 - 77
- [32] Multi-task Transfer with Practice 2021 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2021), 2021,
- [34] Multi-Task Ensemble Learning for Affect Recognition PROCEEDINGS OF THE 2018 ACM INTERNATIONAL JOINT CONFERENCE ON PERVASIVE AND UBIQUITOUS COMPUTING AND PROCEEDINGS OF THE 2018 ACM INTERNATIONAL SYMPOSIUM ON WEARABLE COMPUTERS (UBICOMP/ISWC'18 ADJUNCT), 2018, : 553 - 558
- [36] A Learner-Independent Knowledge Transfer Approach to Multi-task Learning Cognitive Computation, 2014, 6 : 304 - 320
- [37] Knowledge Transfer in Multi-Task Deep Reinforcement Learning for Continuous Control ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
- [38] MULTI-TASK DISTILLATION: TOWARDS MITIGATING THE NEGATIVE TRANSFER IN MULTI-TASK LEARNING 2021 IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2021, : 389 - 393
- [39] Large-Scale Evolutionary Optimization via Multi-Task Random Grouping 2021 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2021, : 778 - 783