Sparse Bayesian Approach for Feature Selection

Cited: 0
Authors
Li, Chang [1 ]
Chen, Huanhuan [1 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, UBRI, Hefei 230027, Peoples R China
Keywords
VECTOR MACHINE; CLASSIFICATION; RELEVANCE; CANCER;
DOI
Not available
Chinese Library Classification
TP18 [Theory of Artificial Intelligence]
Discipline Codes
081104 ; 0812 ; 0835 ; 1405
Abstract
This paper employs a sparse Bayesian approach to enable the Probabilistic Classification Vector Machine (PCVM) to select a relevant subset of features. Owing to its probabilistic outputs and its ability to automatically optimize the regularization terms, the sparse Bayesian framework has shown great advantages in real-world applications. However, Gaussian priors that assign the same prior to different classes may lead to instability in classification. PCVM therefore adopts an improved Gaussian prior whose sign is determined by the class label. In this paper, we present a joint classifier and feature learning algorithm: the Feature Selection Probabilistic Classification Vector Machine (FPCVM). The improved Gaussian priors, named truncated Gaussian priors, are introduced into the feature space for feature selection and into the sample space to induce sparsity in the weight parameters. The expectation-maximization (EM) algorithm is employed to obtain a maximum a posteriori (MAP) estimate of these parameters. In experiments, both classification accuracy and feature-selection performance are evaluated on synthetic datasets, benchmark datasets, and high-dimensional gene expression datasets.
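The abstract describes MAP estimation under a sign-constrained ("truncated") Gaussian prior, with EM-style hyperparameter updates pruning irrelevant weights. As a rough illustration of that idea only (not the authors' FPCVM algorithm: the RBF kernel, learning rate, ARD-style precision update, and projection step below are all assumptions), a minimal sketch:

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Pairwise RBF kernel matrix (an assumed basis; the paper's choice may differ)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30, 30)))

def fit_truncated_map(X, y, n_iter=300, lr=0.5):
    """MAP estimation with a truncated-Gaussian-style prior:
    each sample weight w_i is forced to share the sign of its label y_i,
    and an ARD-like precision update prunes irrelevant weights."""
    n = len(y)
    K = rbf_kernel(X, X)
    w = 0.01 * y                            # start inside the feasible region
    alpha = np.ones(n)                      # per-weight prior precisions
    t = (y + 1) / 2                         # map labels {-1,+1} -> {0,1}
    for _ in range(n_iter):
        p = sigmoid(K @ w)
        grad = K.T @ (t - p) - alpha * w    # gradient of the log posterior
        w += lr * grad / n
        w = np.where(y * w < 0, 0.0, w)     # truncation: sign must match label
        alpha = 1.0 / (w ** 2 + 1e-2)       # EM-like ARD update; large alpha prunes w_i
    return w, K

# Toy demo on two synthetic Gaussian blobs (illustrative data, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
y = np.concatenate([np.ones(50), -np.ones(50)])
w, K = fit_truncated_map(X, y)
pred = np.where(K @ w >= 0, 1, -1)
acc = (pred == y).mean()
```

Here the sign constraint is imposed by projecting after each gradient step; the paper instead builds the constraint into the prior itself and derives EM updates for the MAP solution.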
Pages: 7-13 (7 pages)
Related Papers (50 total)
  • [1] A sparse Bayesian approach for joint feature selection and classifier learning
    Lapedriza, Àgata
    Seguí, Santi
    Masip, David
    Vitrià, Jordi
    [J]. Pattern Analysis and Applications, 2008, 11 (3-4) : 299 - 308
  • [2] Feature selection using sparse Bayesian inference
    Brandes, T. Scott
    Baxter, James R.
    Woodworth, Jonathan
    [J]. Algorithms for Synthetic Aperture Radar Imagery XXI, 2014, 9093
  • [3] Bayesian feature selection for sparse topic model
    Chang, Ying-Lan
    Lee, Kuen-Feng
    Chien, Jen-Tzung
    [J]. 2011 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), 2011
  • [4] The Relevance Sample-Feature Machine: A Sparse Bayesian Learning Approach to Joint Feature-Sample Selection
    Mohsenzadeh, Yalda
    Sheikhzadeh, Hamid
    Reza, Ali M.
    Bathaee, Najmehsadat
    Kalayeh, Mahdi M.
    [J]. IEEE Transactions on Cybernetics, 2013, 43 (06) : 2241 - 2254
  • [5] A double-layer ELM with added feature selection ability using a sparse Bayesian approach
    Kiaee, Farkhondeh
    Gagne, Christian
    Sheikhzadeh, Hamid
    [J]. Neurocomputing, 2016, 216 : 371 - 380
  • [6] Novel sparse Bayesian feature selection methods for microarray analysis and QSAR variable selection
    Winkler, David
    Burden, Frank
    Halley, Julianne
    [J]. Drugs of the Future, 2007, 32 : 26 - 26
  • [7] A feature selection Bayesian approach for a clustering genetic algorithm
    Hruschka, ER
    Hruschka, ER
    Ebecken, NFF
    [J]. Data Mining IV, 2004, 7 : 181 - 192
  • [8] A Bayesian approach to joint feature selection and classifier design
    Krishnapuram, B
    Hartemink, AJ
    Carin, L
    Figueiredo, MAT
    [J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26 (09) : 1105 - 1111
  • [9] A Bayesian approach to sparse model selection in statistical shape models
    Gooya, Ali
    Davatzikos, Christos
    Frangi, Alejandro F.
    [J]. SIAM Journal on Imaging Sciences, 2015, 8 (02) : 858 - 887