HIERARCHICAL MULTI-TASK LEARNING VIA TASK AFFINITY GROUPINGS

Cited by: 0
Authors
Srivastava, Siddharth [1]
Bhugra, Swati [2 ]
Kaushik, Vinay [3 ]
Lall, Brejesh [2 ]
Affiliations
[1] TensorTour, Gurgaon, Haryana, India
[2] Indian Inst Technol Delhi, New Delhi, India
[3] Dview, Bengaluru, Karnataka, India
Keywords
Multi-task Learning; Detection; Semantic Segmentation; Depth Prediction; Inter-Task Affinity;
DOI
10.1109/ICIP49359.2023.10223053
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Multi-task learning (MTL) permits joint task learning with a shared deep learning architecture and multiple loss functions. Despite recent advances in MTL, when tasks are unrelated one task's loss often dominates the optimization, which frequently results in worse performance than the corresponding single-task learning. To overcome this "negative transfer", we propose a novel hierarchical framework that leverages task relations via inter-task affinity to supervise multi-task learning. Specifically, the task sets generated from inter-task affinity, with the low-level task set at the bottom layer and the complex task set at the top layer, enable iterative multi-task information sharing. In addition, the framework alleviates the need to annotate every image for all tasks simultaneously. The proposed framework achieves state-of-the-art results on classification, detection, semantic segmentation and depth estimation across three standard benchmarks. Furthermore, with state-of-the-art results on two image retrieval benchmarks, we also demonstrate that the embeddings learned with this framework generalize well and provide robust representation learning.
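This record contains no implementation details beyond the abstract. As a rough illustration of what an inter-task affinity measure can look like, the minimal Python sketch below estimates, for each ordered task pair (i, j), how a single gradient step on task i's loss applied to a copy of a shared encoder changes task j's loss on the same batch; positive values suggest the two tasks may transfer well to each other. This is not the authors' implementation: the names `encoder`, `heads`, `losses`, the batch layout and the learning rate are all hypothetical.

```python
# Minimal sketch, NOT the authors' implementation: a lookahead-style
# inter-task affinity for a shared encoder with one head per task.
# `encoder`, `heads`, `losses`, the batch layout and `lr` are assumptions.
import copy
import torch


def inter_task_affinity(encoder, heads, losses, batch, lr=1e-2):
    """Return affinity[i][j]: relative drop in task j's loss after one
    SGD step on task i's loss applied to a copy of the shared encoder.
    Positive values suggest tasks i and j may belong in the same group."""
    tasks = list(heads.keys())

    # Baseline per-task losses with the current shared encoder.
    with torch.no_grad():
        feats = encoder(batch["x"])
        base = {t: losses[t](heads[t](feats), batch[t]).item() for t in tasks}

    affinity = {i: {} for i in tasks}
    for i in tasks:
        # One SGD step on task i's loss; only the encoder copy is updated.
        enc_i = copy.deepcopy(encoder)
        opt = torch.optim.SGD(enc_i.parameters(), lr=lr)
        loss_i = losses[i](heads[i](enc_i(batch["x"])), batch[i])
        opt.zero_grad()
        loss_i.backward()
        opt.step()

        # Re-evaluate every task's loss with the updated encoder copy.
        with torch.no_grad():
            feats_i = enc_i(batch["x"])
            for j in tasks:
                new_j = losses[j](heads[j](feats_i), batch[j]).item()
                affinity[i][j] = 1.0 - new_j / max(base[j], 1e-12)
    return affinity
```

Averaging such scores over training steps and then clustering tasks with high mutual affinity is one plausible way to obtain the low-level and complex task sets the abstract refers to; the paper's actual grouping procedure may differ.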
Pages: 3289-3293
Page count: 5