Encoding Tree Sparsity in Multi-Task Learning: A Probabilistic Framework

Cited by: 0
Authors
Han, Lei [1]
Zhang, Yu [2,3]
Song, Guojie [1]
Xie, Kunqing [1]
Affiliations
[1] Key Laboratory of Machine Perception (Ministry of Education), EECS, Peking University, Beijing, China
[2] Department of Computer Science, Hong Kong Baptist University, Hong Kong
[3] Institute of Research and Continuing Education, Hong Kong Baptist University, Shenzhen, China
Funding
National High-Tech Research and Development Program of China (863 Program)
Keywords
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory]
Subject Classification
081104; 0812; 0835; 1405
Abstract
Multi-task learning (MTL) seeks to improve generalization performance by sharing common information among multiple related tasks. A key assumption in most MTL algorithms is that all tasks are related, which may not hold in many real-world applications. Existing techniques that address this issue identify groups of related tasks using group sparsity. In this paper, we propose a probabilistic tree sparsity (PTS) model that exploits a tree structure, rather than a group structure, to obtain the sparse solution. Specifically, each model coefficient in the learning model is decomposed into a product of multiple component coefficients, each of which corresponds to a node in the tree. Based on this decomposition, Gaussian and Cauchy distributions are placed on the component coefficients as priors to restrict the model complexity. We devise an efficient expectation-maximization (EM) algorithm to learn the model parameters. Experiments conducted on both synthetic and real-world problems show the effectiveness of our model compared with state-of-the-art baselines.
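To make the decomposition concrete, here is a minimal Python sketch, not the authors' implementation: it assumes a small fixed task tree, forms each task's coefficient vector as the elementwise product of the component coefficient vectors along its root-to-leaf path, and evaluates Gaussian and Cauchy log-priors on the components. All names (parent, task_leaf, theta) and the elementwise-product form are illustrative assumptions.

```python
import numpy as np

# Hypothetical task tree: node 0 is the root, nodes 1-2 are internal,
# nodes 3-6 are leaves, one per task. parent[v] maps a node to its parent.
parent = {1: 0, 2: 0, 3: 1, 4: 1, 5: 2, 6: 2}
task_leaf = {0: 3, 1: 4, 2: 5, 3: 6}  # task index -> leaf node

d = 5  # feature dimension
rng = np.random.default_rng(0)

# One component coefficient vector per tree node (root included).
theta = {v: rng.normal(size=d) for v in range(7)}

def path_to_root(leaf):
    """Nodes from a leaf up to the root, leaf first."""
    path = [leaf]
    while path[-1] in parent:
        path.append(parent[path[-1]])
    return path

def task_coefficients(task):
    """Coefficients of a task: elementwise product of the component
    coefficients on its root-to-leaf path, mirroring the decomposition
    described in the abstract (elementwise form assumed)."""
    w = np.ones(d)
    for v in path_to_root(task_leaf[task]):
        w *= theta[v]
    return w

def gaussian_log_prior(x, sigma=1.0):
    """Log density of i.i.d. N(0, sigma^2) components (light-tailed)."""
    return -0.5 * np.sum(x**2) / sigma**2 - x.size * np.log(sigma * np.sqrt(2 * np.pi))

def cauchy_log_prior(x, gamma=1.0):
    """Log density of i.i.d. Cauchy(0, gamma) components (heavy-tailed)."""
    return -np.sum(np.log(np.pi * gamma * (1.0 + (x / gamma) ** 2)))

print(task_coefficients(0))          # w for task 0: theta[3] * theta[1] * theta[0]
print(gaussian_log_prior(theta[0]))  # e.g., Gaussian prior on an internal node
print(cauchy_log_prior(theta[3]))    # e.g., Cauchy prior on a leaf node
```

The heavy-tailed Cauchy prior tolerates a few large component values while the Gaussian prior shrinks components toward zero; combining such priors along tree paths is consistent with the abstract's goal of a tree-structured sparse solution.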
Pages: 1854-1860 (7 pages)
Related Papers
50 items in total
  • [41] Learning to Branch for Multi-Task Learning
    Guo, Pengsheng
    Lee, Chen-Yu
    Ulbricht, Daniel
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [42] Learning to Branch for Multi-Task Learning
    Guo, Pengsheng
    Lee, Chen-Yu
    Ulbricht, Daniel
    [J]. 25TH AMERICAS CONFERENCE ON INFORMATION SYSTEMS (AMCIS 2019), 2019
  • [43] Boosted multi-task learning
    Chapelle, Olivier
    Shivaswamy, Pannagadatta
    Vadrevu, Srinivas
    Weinberger, Kilian
    Zhang, Ya
    Tseng, Belle
    [J]. MACHINE LEARNING, 2011, 85: 149-173
  • [44] An overview of multi-task learning
    Zhang, Yu
    Yang, Qiang
    [J]. NATIONAL SCIENCE REVIEW, 2018, 5 (01): 30-43
  • [45] On Partial Multi-Task Learning
    He, Yi
    Wu, Baijun
    Wu, Di
    Wu, Xindong
    [J]. ECAI 2020: 24TH EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2020, 325: 1174-1181
  • [46] Asynchronous Multi-Task Learning
    Baytas, Inci M.
    Yan, Ming
    Jain, Anil K.
    Zhou, Jiayu
    [J]. 2016 IEEE 16TH INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2016: 11-20
  • [47] Pareto Multi-Task Learning
    Lin, Xi
    Zhen, Hui-Ling
    Li, Zhenhua
    Zhang, Qingfu
    Kwong, Sam
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [48] Federated Multi-Task Learning
    Smith, Virginia
    Chiang, Chao-Kai
    Sanjabi, Maziar
    Talwalkar, Ameet
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [49] Calibrated Multi-Task Learning
    Nie, Feiping
    Hu, Zhanxuan
    Li, Xuelong
    [J]. KDD'18: PROCEEDINGS OF THE 24TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2018: 2012-2021