A sparse Bayesian approach for joint feature selection and classifier learning

Cited: 9
Authors
Lapedriza, Agata [1 ,2 ]
Segui, Santi [1 ]
Masip, David [1 ,3 ]
Vitria, Jordi [1 ,4 ]
Affiliations
[1] Univ Autonoma Barcelona, Comp Vis Ctr, E-08193 Barcelona, Spain
[2] Univ Autonoma Barcelona, Dept Informat, E-08193 Barcelona, Spain
[3] Univ Oberta Catalunya, Barcelona 08018, Spain
[4] Univ Barcelona, Dept Matemat Aplicada & Anal, Barcelona, Spain
Keywords
feature selection; Bayesian learning; classification;
DOI
10.1007/s10044-008-0130-1
CLC (Chinese Library Classification)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper we present a new method for joint feature selection and classifier learning using a sparse Bayesian approach. Both tasks are performed by optimizing a global loss function that combines a term for the empirical loss with a term imposing a feature-selection and regularization constraint on the parameters. To minimize this function we use a recently proposed technique, the Boosted Lasso algorithm, which follows the regularization path of the empirical risk associated with our loss function. We develop the algorithm for a well-known non-parametric classification method, the relevance vector machine, and perform experiments using a synthetic data set and three databases from the UCI Machine Learning Repository. The results show that our method is able to select the relevant features, in some cases increasing classification accuracy when feature selection is performed.
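The abstract describes minimizing a penalized loss (empirical loss plus a sparsity-inducing regularization term) by following its regularization path. As a rough illustration of that idea, the sketch below fits an L1-penalized logistic loss with small epsilon-sized forward stagewise steps on synthetic data where only two features carry signal. This is a simplified forward-stagewise variant, not the paper's Boosted Lasso (which also takes backward steps) nor its relevance vector machine; all names, step sizes, and the data-generating setup are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch only: epsilon-stagewise minimization of an
# L1-penalized logistic loss. The full Boosted Lasso of the paper also
# allows backward steps; this simplified version takes forward steps only.
rng = np.random.default_rng(0)
n, d = 200, 10
X = rng.normal(size=(n, d))
# Only features 0 and 1 are relevant to the label; the rest are noise.
y = (X[:, 0] - 2 * X[:, 1] + 0.1 * rng.normal(size=n) > 0).astype(float)

def penalized_loss(w, lam=0.1):
    """Empirical logistic loss plus an L1 sparsity/regularization term."""
    margin = (2 * y - 1) * (X @ w)
    return np.mean(np.log1p(np.exp(-margin))) + lam * np.abs(w).sum()

w = np.zeros(d)
eps = 0.05  # fixed step size along the regularization path
for _ in range(400):
    # Forward step: move the single coordinate (in either direction)
    # that most decreases the penalized loss.
    j, s = min(
        ((j, s) for j in range(d) for s in (+eps, -eps)),
        key=lambda js: penalized_loss(w + js[1] * np.eye(d)[js[0]]),
    )
    if penalized_loss(w + s * np.eye(d)[j]) < penalized_loss(w):
        w[j] += s
    else:
        break  # no step improves: a local end of the path

# Features with nonzero weight are the ones the procedure "selected".
selected = sorted(np.nonzero(np.abs(w) > 1e-8)[0])
print(selected)
```

On this synthetic problem the procedure drives the weights of the two informative features away from zero while the L1 penalty keeps most noise features exactly at zero, mirroring the joint selection-and-learning behavior the abstract describes.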
Pages: 299-308 (10 pages)