A sparse Bayesian approach for joint feature selection and classifier learning

Cited by: 9
Authors
Lapedriza, Agata [1 ,2 ]
Segui, Santi [1 ]
Masip, David [1 ,3 ]
Vitria, Jordi [1 ,4 ]
Affiliations
[1] Univ Autonoma Barcelona, Comp Vis Ctr, E-08193 Barcelona, Spain
[2] Univ Autonoma Barcelona, Dept Informat, E-08193 Barcelona, Spain
[3] Univ Oberta Catalunya, Barcelona 08018, Spain
[4] Univ Barcelona, Dept Matemat Aplicada & Anal, Barcelona, Spain
Keywords
feature selection; Bayesian learning; classification;
DOI
10.1007/s10044-008-0130-1
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
In this paper we present a new method for joint feature selection and classifier learning using a sparse Bayesian approach. Both tasks are performed by optimizing a global loss function that includes a term associated with the empirical loss and another representing a feature selection and regularization constraint on the parameters. To minimize this function we use a recently proposed technique, the Boosted Lasso algorithm, which follows the regularization path of the empirical risk associated with our loss function. We develop the algorithm for a well-known non-parametric classification method, the relevance vector machine, and perform experiments using a synthetic data set and three databases from the UCI Machine Learning Repository. The results show that our method selects the relevant features and, in some cases, increases classification accuracy when feature selection is performed.
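The abstract couples an empirical loss with a lasso-style feature-selection penalty optimized along a regularization path. The paper's actual Boosted Lasso / relevance vector machine procedure is not reproduced here; as a loosely related sketch, the core idea — an L1 penalty zeroing out irrelevant features while a classifier is trained — can be illustrated with plain L1-penalized logistic regression fitted by proximal gradient descent. All data, parameter values, and variable names below are illustrative assumptions, not taken from the paper:

```python
import math
import random

random.seed(0)

# Synthetic data (illustrative): 200 samples, 5 features;
# only features 0 and 1 carry signal for the binary label.
n, d = 200, 5
X = [[random.gauss(0.0, 1.0) for _ in range(d)] for _ in range(n)]
y = [1 if 2.0 * x[0] - 2.0 * x[1] + random.gauss(0.0, 0.3) > 0 else 0 for x in X]

def sigmoid(z):
    # Clamp to avoid math.exp overflow on large |z|.
    return 1.0 / (1.0 + math.exp(-max(min(z, 30.0), -30.0)))

# L1-penalized logistic regression via proximal gradient descent:
# a gradient step on the average logistic loss, then soft-thresholding,
# which shrinks the weights of uninformative features toward exactly zero.
w = [0.0] * d
lr, lam = 0.1, 0.02  # step size and L1 penalty strength (illustrative values)
for _ in range(500):
    grad = [0.0] * d
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)))
        for j in range(d):
            grad[j] += (p - yi) * xi[j] / n
    for j in range(d):
        wj = w[j] - lr * grad[j]
        # Proximal operator of the L1 penalty (soft-thresholding).
        w[j] = math.copysign(max(abs(wj) - lr * lam, 0.0), wj)

print("weights:", [round(wj, 3) for wj in w])
```

With the penalty active, the soft-thresholding step keeps the weights of the three noise features near zero while the two informative features retain large weights of the correct sign, mirroring in miniature the joint selection-and-learning behaviour the abstract describes.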
Pages: 299-308
Number of pages: 10
Related papers
50 records
  • [41] Sparse Bayesian Learning based on an efficient subset selection
    Bo, LF
    Wang, L
    Rao, LC
    ADVANCES IN NEURAL NETWORKS - ISNN 2004, PT 1, 2004, 3173 : 264 - 269
  • [42] Sparse Bayesian learning approach for baseline correction
    Li, Haoran
    Dai, Jisheng
    Pan, Tianhong
    Chang, Chunqi
    So, Hing Cheung
    CHEMOMETRICS AND INTELLIGENT LABORATORY SYSTEMS, 2020, 204
  • [43] Sparse Extreme Learning Machine Classifier Using Empirical Feature Mapping
    Kitamura, Takuya
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I, 2016, 9886 : 486 - 493
  • [44] Communication-Efficient Decentralized Sparse Bayesian Learning of Joint Sparse Signals
    Khanna, Saurabh
    Murthy, Chandra R.
    IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2017, 3 (03): : 617 - 630
  • [45] Feature selection by combining subspace learning with sparse representation
    Cheng, Debo
    Zhang, Shichao
    Liu, Xingyi
    Sun, Ke
    Zong, Ming
    MULTIMEDIA SYSTEMS, 2017, 23 (03) : 285 - 291
  • [46] Robust Sparse Subspace Learning for Unsupervised Feature Selection
    Wang, Feng
    Rao, Qi
    Zhang, Yongquan
    Chen, Xu
    2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 4205 - 4212
  • [47] Novel sparse Bayesian feature selection methods for microarray analysis and QSAR variable selection
    Winkler, David
    Burden, Frank
    Halley, Julianne
    DRUGS OF THE FUTURE, 2007, 32 : 26 - 26
  • [49] Leukocyte classification based on feature selection using extra trees classifier: a transfer learning approach
    Baby, Diana
    Devaraj, Sujitha Juliet
    Hemanth, Jude
    Raj, Anishin M. M.
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2021, 29 : 2742 - 2757
  • [50] A Bayesian approach to learning classifier systems in uncertain environments
    Aliprandi, Davide
    Mancastroppa, Alex
    Matteucci, Matteo
    GECCO 2006: GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, VOL 1 AND 2, 2006, : 1537 - +