Compressed Hierarchical Representations for Multi-Task Learning and Task Clustering

Cited by: 0
Authors
de Freitas, Joao Machado [1 ]
Berg, Sebastian [2 ]
Geiger, Bernhard C. [1 ]
Muecke, Manfred [2 ]
Affiliations
[1] Know-Center GmbH, Graz, Austria
[2] Materials Center Leoben Forschung GmbH, Leoben, Austria
Keywords
Representation learning; multi-task learning; disentanglement; information bottleneck
DOI
10.1109/IJCNN55064.2022.9892342
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
In this paper, we frame homogeneous-feature multi-task learning (MTL) as a hierarchical representation learning problem, with one task-agnostic and multiple task-specific latent representations. Drawing inspiration from the information bottleneck principle and assuming an additive independent noise model between the task-agnostic and task-specific latent representations, we limit the information contained in each task-specific representation. We show that the resulting representations yield competitive performance on several MTL benchmarks. Furthermore, for certain setups, we show that the trained parameters of the additive noise model are closely related to the similarity of different tasks. This indicates that our approach yields a task-agnostic representation that is disentangled in the sense that its individual dimensions may be interpretable from a task-specific perspective.
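
The abstract gives only a high-level description of the model. As a rough illustration, the sketch below (PyTorch; all module names, dimensions, and the compression penalty are assumptions for illustration, not the authors' implementation) shows one way such a hierarchy could look: a shared task-agnostic encoder, a learned per-task additive Gaussian noise vector that limits how much of the shared representation each task-specific representation retains, and per-task prediction heads.

import torch
import torch.nn as nn

class HierarchicalMTLSketch(nn.Module):
    # Illustrative sketch only: a task-agnostic encoder produces z; each task sees
    # z_t = z + eps_t with eps_t ~ N(0, diag(sigma_t^2)), where sigma_t is learned per task.
    def __init__(self, in_dim, latent_dim, task_out_dims):
        super().__init__()
        # Task-agnostic encoder producing the shared latent z.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 128), nn.ReLU(), nn.Linear(128, latent_dim)
        )
        # One learned log-std vector per task for the additive noise model.
        self.log_sigma = nn.ParameterList(
            [nn.Parameter(torch.zeros(latent_dim)) for _ in task_out_dims]
        )
        # Task-specific heads operating on the noisy representations.
        self.heads = nn.ModuleList(
            [nn.Linear(latent_dim, d) for d in task_out_dims]
        )

    def forward(self, x):
        z = self.encoder(x)  # shared, task-agnostic representation
        outputs, compression_penalty = [], 0.0
        for log_sigma, head in zip(self.log_sigma, self.heads):
            eps = torch.randn_like(z) * log_sigma.exp()
            z_t = z + eps  # task-specific representation: a noisy copy of z
            outputs.append(head(z_t))
            # Penalize small noise variances, i.e. encourage compression of each
            # task-specific representation; an assumed stand-in for the paper's
            # information-bottleneck-style constraint.
            compression_penalty = compression_penalty + (-log_sigma).sum()
        return outputs, compression_penalty

# Example usage (dimensions are arbitrary):
# model = HierarchicalMTLSketch(in_dim=32, latent_dim=16, task_out_dims=[10, 10, 2])
# preds, penalty = model(torch.randn(8, 32))
# total_loss = sum_of_task_losses + beta * penalty   # beta: assumed trade-off weight

Under this reading, comparing or clustering the learned per-task noise vectors after training would give a notion of task similarity, in the spirit of the task-clustering observation reported in the abstract.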
Pages: 8