A Survey on Multi-Task Learning

Cited by: 701
Authors
Zhang, Yu [1 ,2 ]
Yang, Qiang [3 ]
Affiliations
[1] Southern Univ Sci & Technol, Dept Comp Sci & Engn, Shenzhen 518055, Guangdong, Peoples R China
[2] Peng Cheng Lab, Shenzhen 518066, Guangdong, Peoples R China
[3] Hong Kong Univ Sci & Technol, Dept Comp Sci & Engn, Hong Kong, Peoples R China
Keywords
Task analysis; Training; Computational modeling; Classification algorithms; Transfer learning; Supervised learning; Data models; Multi-task learning; machine learning; artificial intelligence; DEEP NEURAL-NETWORKS; MULTIPLE TASKS; CLASSIFICATION; MODEL; ALGORITHMS; REGRESSION; FRAMEWORK; TRACKING; SPARSITY; RECOVERY
DOI
10.1109/TKDE.2021.3070203
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Multi-Task Learning (MTL) is a learning paradigm in machine learning that aims to leverage useful information contained in multiple related tasks to improve the generalization performance of all of them. In this paper, we present a survey of MTL from the perspectives of algorithmic modeling, applications, and theoretical analyses. For algorithmic modeling, we first give a definition of MTL and then classify MTL algorithms into five categories: the feature learning approach, the low-rank approach, the task clustering approach, the task relation learning approach, and the decomposition approach, discussing the characteristics of each. To further improve the performance of learning tasks, MTL can be combined with other learning paradigms, including semi-supervised learning, active learning, unsupervised learning, reinforcement learning, multi-view learning, and graphical models. When the number of tasks is large or the data dimensionality is high, online, parallel, and distributed MTL models, as well as dimensionality reduction and feature hashing, offer computational and storage advantages, and we review such models. Many real-world applications use MTL to boost their performance, and we review representative works in this paper. Finally, we present theoretical analyses and discuss several future directions for MTL.
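To make the paradigm concrete, below is a minimal sketch of hard parameter sharing, one common instantiation of the feature learning approach the abstract mentions: two regression tasks are trained jointly through a shared feature layer plus task-specific heads. The synthetic data, variable names, and learning rate are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two related regression tasks generated from a common latent projection,
# so sharing a feature layer across tasks can actually help.
X1, X2 = rng.normal(size=(100, 10)), rng.normal(size=(100, 10))
latent = rng.normal(size=(10, 4))
y1 = X1 @ latent @ rng.normal(size=4)
y2 = X2 @ latent @ rng.normal(size=4)

# Hard parameter sharing: one shared projection W, one head per task.
W = rng.normal(size=(10, 4)) * 0.1   # shared feature layer
h1 = np.zeros(4)                      # task-1 head
h2 = np.zeros(4)                      # task-2 head

def total_loss():
    r1 = X1 @ W @ h1 - y1
    r2 = X2 @ W @ h2 - y2
    return (r1 @ r1) / len(y1) + (r2 @ r2) / len(y2)

lr = 0.01
start = total_loss()
for _ in range(200):
    r1 = X1 @ W @ h1 - y1
    r2 = X2 @ W @ h2 - y2
    # Gradients of the summed per-task mean-squared losses:
    # the shared layer W accumulates signal from BOTH tasks.
    gW = (X1.T @ r1)[:, None] @ h1[None, :] / len(y1) \
       + (X2.T @ r2)[:, None] @ h2[None, :] / len(y2)
    gh1 = W.T @ (X1.T @ r1) / len(y1)
    gh2 = W.T @ (X2.T @ r2) / len(y2)
    W -= lr * gW
    h1 -= lr * gh1
    h2 -= lr * gh2
end = total_loss()
```

The key design point is that `W` receives gradient contributions from both tasks, which is how information is transferred between them; the heads `h1` and `h2` stay task-specific.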
Pages: 5586-5609
Page count: 24
Related Papers
50 in total
  • [1] Survey of Multi-Task Learning
    Zhang Y.
    Liu J.-W.
    Zuo X.
    [J]. 1600, Science Press (43): 1340 - 1378
  • [2] Survey on Multi-Task Learning in Smart Transportation
    Alzahrani, Mohammed
    Wang, Qianlong
    Liao, Weixian
    Chen, Xuhui
    Yu, Wei
    [J]. IEEE ACCESS, 2024, 12 : 17023 - 17044
  • [3] A Survey of Multi-Task Deep Reinforcement Learning
    Vithayathil Varghese, Nelson
    Mahmoud, Qusay H.
    [J]. ELECTRONICS, 2020, 9 (09) : 1 - 21
  • [4] A Survey of Multi-task Learning Methods in Chemoinformatics
    Sosnin, Sergey
    Vashurina, Mariia
    Withnall, Michael
    Karpov, Pavel
    Fedorov, Maxim
    Tetko, Igor V.
    [J]. MOLECULAR INFORMATICS, 2019, 38 (04)
  • [5] Multi-task gradient descent for multi-task learning
    Lu Bai
    Yew-Soon Ong
    Tiantian He
    Abhishek Gupta
    [J]. Memetic Computing, 2020, 12 : 355 - 369
  • [6] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    [J]. MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [7] A survey on kernel-based multi-task learning
    Ruiz, Carlos
    Alaiz, Carlos M.
    Dorronsoro, Jose R.
    [J]. NEUROCOMPUTING, 2024, 577
  • [8] Multi-Task Learning for Dense Prediction Tasks: A Survey
    Vandenhende, Simon
    Georgoulis, Stamatios
    Van Gansbeke, Wouter
    Proesmans, Marc
    Dai, Dengxin
    Van Gool, Luc
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (07) : 3614 - 3633
  • [9] Survey on multi-task learning for object classification and recognition
    Li H.
    Wang F.
    Ding W.
    [J]. Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2022, 43 (01):
  • [10] Drone-Based Tower Survey by Multi-Task Learning
    Sami, Mirza Tanzim
    Yan, Da
    Huang, Huang
    Liang, Xinyu
    Guo, Guimu
    Jiang, Zhe
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 6011 - 6013