Encoding Tree Sparsity in Multi-Task Learning: A Probabilistic Framework

Times Cited: 0
Authors
Han, Lei [1 ]
Zhang, Yu [2 ,3 ]
Song, Guojie [1 ]
Xie, Kunqing [1 ]
Affiliations
[1] Peking Univ, EECS, Minist Educ, Key Lab Machine Percept, Beijing, Peoples R China
[2] Hong Kong Baptist Univ, Dept Comp Sci, Hong Kong, Peoples R China
[3] Hong Kong Baptist Univ Shenzhen, Inst Res & Continuing Educ, Shenzhen, Peoples R China
Funding
National High Technology Research and Development Program of China (863 Program);
Keywords
DOI
Not available
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Multi-task learning (MTL) seeks to improve generalization performance by sharing common information among multiple related tasks. A key assumption in most MTL algorithms is that all tasks are related, which, however, may not hold in many real-world applications. Existing techniques that attempt to address this issue aim to identify groups of related tasks using group sparsity. In this paper, we propose a probabilistic tree sparsity (PTS) model that exploits a tree structure, instead of a group structure, to obtain sparse solutions. Specifically, each model coefficient in the learning model is decomposed into a product of component coefficients, each of which corresponds to a node in the tree. Based on this decomposition, Gaussian and Cauchy distributions are placed on the component coefficients as priors to restrict the model complexity. We devise an efficient expectation-maximization (EM) algorithm to learn the model parameters. Experiments conducted on both synthetic and real-world problems show the effectiveness of our model compared with state-of-the-art baselines.
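To make the decomposition described in the abstract concrete, the following is a minimal sketch, assuming a tree whose leaves correspond to tasks; the notation (w_{ij}, \theta_{vj}, path(i)) is illustrative and not drawn from the paper itself. Writing \mathrm{path}(i) for the set of nodes from the root to the leaf of task i, the j-th model coefficient of task i is factored over the nodes on that path,

w_{ij} = \prod_{v \in \mathrm{path}(i)} \theta_{vj},

and the component coefficients receive Gaussian or heavy-tailed Cauchy priors, e.g.

\theta_{vj} \sim \mathcal{N}(0, \sigma_v^{2}) \quad \text{or} \quad \theta_{vj} \sim \mathrm{Cauchy}(0, \gamma_v).

Under this assumed notation, driving a single \theta_{vj} toward zero at an internal node v zeros out w_{ij} for every task in the subtree rooted at v, which is how the tree structure induces structured sparsity; the EM algorithm mentioned in the abstract would then estimate the component coefficients and prior scales.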
Pages: 1854 - 1860
Number of pages: 7
Related Papers
50 records in total
  • [1] Probabilistic movement primitives based multi-task learning framework
    Yue, Chengfei
    Gao, Tian
    Lu, Lang
    Lin, Tao
    Wu, Yunhua
    [J]. COMPUTERS & INDUSTRIAL ENGINEERING, 2024, 191
  • [2] Learning Tree Structure in Multi-Task Learning
    Han, Lei
    Zhang, Yu
    [J]. KDD'15: PROCEEDINGS OF THE 21ST ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING, 2015, : 397 - 406
  • [3] Probabilistic Joint Feature Selection for Multi-task Learning
    Xiong, Tao
    Bi, Jinbo
    Rao, Bharat
    Cherkassky, Vladimir
    [J]. PROCEEDINGS OF THE SEVENTH SIAM INTERNATIONAL CONFERENCE ON DATA MINING, 2007, : 332 - +
  • [4] Multi-task Sparse Gaussian Processes with Improved Multi-task Sparsity Regularization
    Zhu, Jiang
    Sun, Shiliang
    [J]. PATTERN RECOGNITION (CCPR 2014), PT I, 2014, 483 : 54 - 62
  • [5] Multi-task learning framework for echocardiography segmentation
    Monkam, Patrice
    Jin, Songbai
    Lu, Wenkai
    [J]. 2022 IEEE INTERNATIONAL ULTRASONICS SYMPOSIUM (IEEE IUS), 2022,
  • [6] A Regression Model Tree Algorithm by Multi-task Learning
    Jo, Seeun
    Jun, Chi-Hyuck
    [J]. INDUSTRIAL ENGINEERING AND MANAGEMENT SYSTEMS, 2021, 20 (02): : 163 - 171
  • [7] Probabilistic Multi-Task Learning for Visual Saliency Estimation in Video
    Li, Jia
    Tian, Yonghong
    Huang, Tiejun
    Gao, Wen
    [J]. INTERNATIONAL JOURNAL OF COMPUTER VISION, 2010, 90 (02) : 150 - 165
  • [8] Fast and Robust Multi-View Multi-Task Learning via Group Sparsity
    Sun, Lu
    Nguyen, Canh Hao
    Mamitsuka, Hiroshi
    [J]. PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 3499 - 3505
  • [9] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    [J]. MEMETIC COMPUTING, 2020, 12 : 355 - 369