Feature selection with kernel class separability

Cited by: 152
Author
Wang, Lei [1 ]
Affiliation
[1] Australian Natl Univ, Res Sch Informat Sci & Engn, Dept Informat Engn, Canberra, ACT 0200, Australia
Funding
Australian Research Council;
Keywords
kernel class separability; feature selection; Support Vector Machines; Kernel Fisher Discriminant Analysis; pattern classification;
DOI
10.1109/TPAMI.2007.70799
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Classification can often benefit from efficient feature selection. However, linearly nonseparable data, the need for quick response, the small-sample problem, and noisy features make feature selection quite challenging. In this work, a class separability criterion is developed in a high-dimensional kernel space, and feature selection is performed by maximizing this criterion. To make this approach work, the issues of automatic kernel parameter tuning, numerical stability, and regularization for multiparameter optimization are addressed. Theoretical analysis uncovers the relationship of this criterion to the radius-margin bound of Support Vector Machines (SVMs), Kernel Fisher Discriminant Analysis (KFDA), and the kernel alignment criterion, providing further insight into feature selection with this criterion. The criterion is applied to a variety of selection modes with different search strategies. An extensive experimental study demonstrates its efficiency in delivering fast and robust feature selection.
Pages: 1534-1546
Number of pages: 13
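
To make the criterion summarized in the abstract concrete, the following Python sketch scores candidate feature subsets with a kernel-space class-separability ratio tr(S_B)/tr(S_W) computed directly from a Gaussian kernel matrix, and selects features by a greedy forward search. This is a minimal illustration under assumed choices: the Gaussian kernel, the width gamma, the stabilizer eps, and all function names here are hypothetical, and the sketch omits the paper's automatic kernel parameter tuning and regularized multiparameter optimization.

import numpy as np

def gaussian_kernel(X, gamma=1.0):
    # Gaussian (RBF) kernel matrix K_ij = exp(-gamma * ||x_i - x_j||^2).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_class_separability(X, y, gamma=1.0, eps=1e-12):
    # Ratio tr(S_B) / tr(S_W) of between-class to within-class scatter in the
    # kernel-induced feature space, evaluated from kernel sums only:
    #   tr(S_W) = sum_c [ tr(K_c) - (1/n_c) * sum(K_c) ]
    #   tr(S_B) = sum_c (1/n_c) * sum(K_c) - (1/n) * sum(K)
    y = np.asarray(y)
    K = gaussian_kernel(X, gamma)
    n = K.shape[0]
    between, within = -K.sum() / n, 0.0
    for c in np.unique(y):
        idx = np.where(y == c)[0]
        Kc = K[np.ix_(idx, idx)]
        within += np.trace(Kc) - Kc.sum() / len(idx)
        between += Kc.sum() / len(idx)
    return between / (within + eps)  # eps guards against division by ~0

def forward_select(X, y, n_select, gamma=1.0):
    # Greedy forward search: repeatedly add the feature whose inclusion
    # maximizes the kernel class-separability criterion.
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_select and remaining:
        best_j = max(remaining, key=lambda j: kernel_class_separability(
            X[:, selected + [j]], y, gamma))
        selected.append(best_j)
        remaining.remove(best_j)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 6))
    y = (X[:, 0] + X[:, 3] > 0).astype(int)  # only features 0 and 3 carry class information
    print(forward_select(X, y, n_select=2, gamma=0.5))  # expected to favor features 0 and 3

Because tr(S_W) + tr(S_B) equals the total scatter tr(S_T) in the kernel-induced feature space, the ratio increases when class means move apart and each class becomes more compact, which is what adding an informative feature should achieve under this criterion.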