Compressed Hierarchical Representations for Multi-Task Learning and Task Clustering

Cited by: 0
Authors
de Freitas, Joao Machado [1 ]
Berg, Sebastian [2 ]
Geiger, Bernhard C. [1 ]
Muecke, Manfred [2 ]
Affiliations
[1] Know Ctr GmbH, Graz, Austria
[2] Mat Ctr Leoben Forsch GmbH, Leoben, Austria
Keywords
Representation learning; multi-task learning; disentanglement; information bottleneck;
DOI
10.1109/IJCNN55064.2022.9892342
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we frame homogeneous-feature multi-task learning (MTL) as a hierarchical representation learning problem, with one task-agnostic and multiple task-specific latent representations. Drawing inspiration from the information bottleneck principle and assuming an additive independent noise model between the task-agnostic and task-specific latent representations, we limit the information contained in each task-specific representation. It is shown that our resulting representations yield competitive performance for several MTL benchmarks. Furthermore, for certain setups, we show that the trained parameters of the additive noise model are closely related to the similarity of different tasks. This indicates that our approach yields a task-agnostic representation that is disentangled in the sense that its individual dimensions may be interpretable from a task-specific perspective.
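To make the idea in the abstract concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: a shared (task-agnostic) encoder produces a latent z, each task-specific representation is z plus independent Gaussian noise with learnable per-task, per-dimension scales, and a simple penalty on those scales stands in for the information-bottleneck constraint. All class names, variable names, and the penalty form are illustrative assumptions.

import torch
import torch.nn as nn

class HierarchicalMTLSketch(nn.Module):
    """Illustrative sketch (not the authors' code): task-agnostic latent,
    additive-noise task-specific views, and a capacity-style penalty."""

    def __init__(self, in_dim, latent_dim, task_out_dims):
        super().__init__()
        # task-agnostic encoder producing the shared latent z
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim),
        )
        # learnable log noise scales, one vector per task
        # (the additive independent noise model between shared and task-specific latents)
        self.log_sigma = nn.Parameter(torch.zeros(len(task_out_dims), latent_dim))
        # one prediction head per task, applied to the noisy task-specific latent
        self.heads = nn.ModuleList(
            [nn.Linear(latent_dim, d) for d in task_out_dims]
        )

    def forward(self, x):
        z = self.encoder(x)                        # task-agnostic representation
        outputs, penalty = [], 0.0
        for t, head in enumerate(self.heads):
            sigma = self.log_sigma[t].exp()
            z_t = z + sigma * torch.randn_like(z)  # task-specific representation
            outputs.append(head(z_t))
            # Assumed stand-in for the information constraint: penalising -log(sigma)
            # pushes the noise scales up, limiting how much of z each task can use.
            penalty = penalty - self.log_sigma[t].sum()
        return outputs, penalty

Under this reading, tasks whose learned noise scales leave similar latent dimensions noise-free would appear related, which matches the abstract's observation that the trained noise parameters reflect task similarity and can support task clustering.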
Pages: 8