Learning multi-tasks with inconsistent labels by using auxiliary big task

Cited by: 2
Authors
Feng, Quan [1 ,2 ]
Chen, Songcan [1 ,2 ]
Affiliations
[1] Nanjing Univ Aeronaut & Astronaut, Coll Comp Sci & Technol, Nanjing 211106, Peoples R China
[2] Nanjing Univ Aeronaut & Astronaut, MIIT Key Lab Pattern Anal & Machine Intelligence, Nanjing 211106, Peoples R China
Keywords
multi-task learning; inconsistent labels; auxiliary task;
DOI
10.1007/s11704-022-2251-x
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Multi-task learning (MTL) aims to improve model performance by transferring and exploiting knowledge shared among tasks. Existing MTL work mainly focuses on the scenario where the label sets of the multiple tasks (MTs) are the same, so that these labels can be used directly for learning across the tasks. Real-world scenarios are often more general: each task has only a small number of training samples, and the label sets of different tasks overlap only partially or not at all. Learning such MTs is more challenging because less correlation information is available among the tasks. To address this, we propose a framework that learns these tasks by jointly leveraging two sources of information: abundant knowledge from a learnt auxiliary big task whose classes are sufficient to cover those of all the tasks, and the information shared among the partially overlapping tasks themselves. In our implementation, each individual task reuses the network architecture of the learnt auxiliary task; the key idea is to use the available label information to adaptively prune the hidden-layer neurons of the auxiliary network, constructing a dedicated sub-network for each task, while jointly learning across the individual tasks. Extensive experimental results demonstrate that the proposed method is highly competitive with state-of-the-art methods.
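The abstract only sketches the mechanism. The following is a minimal, hypothetical PyTorch sketch of the general idea (per-task soft pruning of an auxiliary network's hidden neurons, plus per-task heads sized to each task's own label set, trained jointly). It is an illustration under assumptions, not the authors' implementation: here the per-task masks are simply learned together with the task losses under a sparsity penalty, rather than derived from label information as described in the paper, and all names (PrunedMultiTaskNet, gates, heads) are invented for the example.

    # Hypothetical sketch, not the paper's code: an auxiliary "big" backbone whose
    # hidden neurons are softly pruned per task, with per-task heads matching each
    # task's (possibly non-overlapping) label set; all tasks are trained jointly.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class PrunedMultiTaskNet(nn.Module):
        def __init__(self, in_dim, hidden_dim, task_num_classes):
            super().__init__()
            # Backbone shared with the auxiliary big task.
            self.backbone = nn.Sequential(nn.Linear(in_dim, hidden_dim), nn.ReLU())
            # One learnable gate vector per task; sigmoid(gate) acts as a soft mask
            # that prunes hidden neurons irrelevant to that task.
            self.gates = nn.ParameterList(
                [nn.Parameter(torch.zeros(hidden_dim)) for _ in task_num_classes]
            )
            # One classification head per task, sized to its own label set.
            self.heads = nn.ModuleList(
                [nn.Linear(hidden_dim, c) for c in task_num_classes]
            )

        def forward(self, x, task_id):
            h = self.backbone(x)
            mask = torch.sigmoid(self.gates[task_id])  # soft pruning mask in (0, 1)
            return self.heads[task_id](h * mask)

    # Joint training over two toy tasks with inconsistent label sets (3 vs. 5 classes).
    tasks = [(torch.randn(32, 16), torch.randint(0, c, (32,))) for c in (3, 5)]
    model = PrunedMultiTaskNet(in_dim=16, hidden_dim=64, task_num_classes=[3, 5])
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)

    for _ in range(100):
        loss = 0.0
        for task_id, (x, y) in enumerate(tasks):
            logits = model(x, task_id)
            # Penalizing the summed mask values pushes unneeded gates toward zero,
            # i.e., encourages actual pruning of hidden neurons for this task.
            sparsity = torch.sigmoid(model.gates[task_id]).sum()
            loss = loss + F.cross_entropy(logits, y) + 1e-3 * sparsity
        opt.zero_grad()
        loss.backward()
        opt.step()

In this sketch, summing the per-task losses in one backward pass is what makes the learning joint: the shared backbone receives gradients from all tasks, while each gate and head specializes to its own label set.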
Pages: 14