Expectation Propagation for Bayesian Multi-task Feature Selection

Cited by: 0
Authors
Hernandez-Lobato, Daniel [1 ]
Miguel Hernandez-Lobato, Jose [2 ]
Helleputte, Thibault [1 ]
Dupont, Pierre [1 ]
Affiliations
[1] Catholic Univ Louvain, ICTEAM Inst, Machine Learning Grp, Pl St Barbe 2, B-1348 Louvain, Belgium
[2] Univ Autonoma Madrid, Dept Comp Sci, E-28049 Madrid, Spain
Keywords
Multi-task learning; feature selection; expectation propagation; approximate Bayesian inference; gene expression
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper we propose a Bayesian model for multi-task feature selection. This model is based on a generalized spike and slab sparse prior distribution that enforces the selection of a common subset of features across several tasks. Since exact Bayesian inference in this model is intractable, approximate inference is performed through expectation propagation (EP). EP approximates the posterior distribution of the model using a parametric probability distribution. This posterior approximation is particularly useful to identify relevant features for prediction. We focus on problems for which the number of features d is significantly larger than the number of instances for each task. We propose an efficient parametrization of the EP algorithm that offers a computational complexity linear in d. Experiments on several multi-task datasets show that the proposed model outperforms baseline approaches for single-task learning or data pooling across all tasks, as well as two state-of-the-art multi-task learning approaches. Additional experiments confirm the stability of the proposed feature selection with respect to various sub-samplings of the training data.
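To make the shared sparsity assumption concrete, the display below is a minimal sketch of a spike-and-slab prior with selection indicators shared across tasks, reconstructed from the abstract's description; the notation (w_{jk} for the weight of feature j in task k, z_j for the shared binary selection indicator, p_0 for the prior inclusion probability, and \sigma_w^2 for the slab variance) is illustrative and not necessarily the paper's own.

$$ p(w_{jk} \mid z_j) \;=\; z_j \, \mathcal{N}(w_{jk} \mid 0, \sigma_w^2) \;+\; (1 - z_j)\, \delta(w_{jk}), \qquad z_j \sim \mathrm{Bernoulli}(p_0), $$

for features j = 1, ..., d and tasks k = 1, ..., K. Because each z_j is shared by all K tasks, a feature is either drawn from the Gaussian slab in every task or pinned to zero by the point mass \delta in all of them, which is what enforces the selection of a common feature subset across tasks. Exact posterior inference over the 2^d configurations of the indicators is intractable, hence the EP approximation described above.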
Pages: 522 - 537
Page count: 16