Feature Selection With Redundancy-Constrained Class Separability

Cited by: 46
Authors
Zhou, Luping [1 ]
Wang, Lei [1 ]
Shen, Chunhua [2 ]
Institutions
[1] Australian Natl Univ, Sch Engn, Canberra, ACT 0200, Australia
[2] NICTA, Canberra Res Lab, Canberra, ACT 2601, Australia
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2010, Vol. 21, Issue 5
Funding
Australian Research Council
Keywords
Class separability measure; feature redundancy; feature selection; fractional programming; integer programming; CLASSIFICATION; ALGORITHMS;
DOI
10.1109/TNN.2010.2044189
CLC Number
TP18 [Theory of Artificial Intelligence]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Scatter-matrix-based class separability is a simple and efficient feature selection criterion in the literature. However, the conventional trace-based formulation does not take feature redundancy into account and is prone to selecting a set of discriminative but mutually redundant features. In this brief, we first theoretically prove that, under this trace-based criterion, the existence of sufficiently correlated features can always prevent the optimal feature set from being selected. Then, building on this criterion, we propose redundancy-constrained feature selection (RCFS). To ensure the algorithm's efficiency and scalability, we characterize the constraints under which the resulting constrained 0-1 optimization can be solved efficiently and globally. By using the totally unimodular (TUM) concept in integer programming, a necessary condition for such constraints is derived. This condition reveals an interesting special case in which qualified redundancy constraints can be conveniently generated via a clustering of features. We study this special case and develop an efficient feature selection approach based on Dinkelbach's algorithm. Experiments on benchmark data sets demonstrate the superior performance of our approach over counterparts without redundancy constraints.
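For illustration only, the sketch below assumes the trace-based criterion takes the common ratio form trace(S_b) / trace(S_w). Both traces decompose over the diagonal entries, so a feature subset's score is a ratio of two linear functions of its 0-1 indicator vector, which is what makes Dinkelbach's algorithm for fractional programming applicable. The function names (scatter_traces, dinkelbach_select), the fixed subset size k, and the brute-force inner search are all illustrative assumptions and not the brief's formulation; RCFS instead solves the inner 0-1 problem as a totally unimodular integer program with clustering-derived redundancy constraints.

import itertools
import numpy as np


def scatter_traces(X, y):
    # Per-feature contributions to trace(S_b) and trace(S_w).  Because the
    # trace of a scatter matrix is the sum of its diagonal, each feature
    # contributes independently, so the subset score becomes a ratio of two
    # linear functions of the 0-1 selection vector.
    overall_mean = X.mean(axis=0)
    b = np.zeros(X.shape[1])   # diagonal of the between-class scatter S_b
    w = np.zeros(X.shape[1])   # diagonal of the within-class scatter S_w
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        b += len(Xc) * (mc - overall_mean) ** 2
        w += ((Xc - mc) ** 2).sum(axis=0)
    return b, w


def dinkelbach_select(b, w, k, tol=1e-9, max_iter=50):
    # Maximize sum(b[s]) / sum(w[s]) over subsets s of size k via Dinkelbach's
    # algorithm: solve the parametric problem max_s sum(b[s]) - lam * sum(w[s]),
    # then update lam with the achieved ratio until the parametric optimum is 0.
    # The inner maximization is brute force here, purely for small toy problems.
    lam, best = 0.0, None
    for _ in range(max_iter):
        best = max(itertools.combinations(range(len(b)), k),
                   key=lambda s: sum(b[i] - lam * w[i] for i in s))
        if abs(sum(b[i] - lam * w[i] for i in best)) < tol:
            break                                  # lam is the optimal ratio
        lam = sum(b[i] for i in best) / sum(w[i] for i in best)
    return list(best), lam


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    y = np.repeat([0, 1], n)
    # Feature 0 is discriminative; feature 1 is a near-copy of it (redundant);
    # features 2-5 are pure noise.
    f0 = np.concatenate([rng.normal(0.0, 1.0, n), rng.normal(3.0, 1.0, n)])
    X = np.column_stack([f0,
                         f0 + rng.normal(0.0, 0.1, 2 * n),
                         rng.normal(size=(2 * n, 4))])
    b, w = scatter_traces(X, y)
    subset, ratio = dinkelbach_select(b, w, k=2)
    print("selected features:", subset, "trace ratio: %.3f" % ratio)

On this toy data the unconstrained criterion selects features 0 and 1, the discriminative but mutually redundant pair, which is exactly the failure mode the brief analyzes and which its redundancy constraints are designed to rule out.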
Pages: 853-858
Number of pages: 6
Related Papers
50 records in total
  • [1] Redundancy-Constrained Feature Selection with Radial Basis Function Networks
    Pal, Nikhil R.
    Malpani, Mridul
    [J]. 2012 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2012,
  • [2] Feature selection with kernel class separability
    Wang, Lei
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2008, 30 (09) : 1534 - 1546
  • [3] Class separability in spaces reduced by feature selection
    Pranckeviciene, Erinija
    Ho, Tin Kam
    Somorjai, Ray
    [J]. 18TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION, VOL 3, PROCEEDINGS, 2006, : 254 - +
  • [4] Feature subset selection, class separability, and genetic algorithms
    Cantú-Paz, E
    [J]. GENETIC AND EVOLUTIONARY COMPUTATION - GECCO 2004, PT 1, PROCEEDINGS, 2004, 3102 : 959 - 970
  • [5] Maximum Relevance and Class Separability for Hyperspectral Feature Selection and Classification
    Jahanshahi, Saeed
    [J]. 2016 IEEE 10TH INTERNATIONAL CONFERENCE ON APPLICATION OF INFORMATION AND COMMUNICATION TECHNOLOGIES (AICT), 2016, : 202 - 205
  • [6] Entropy and memory constrained vector quantization with separability based feature selection
    Yoon, Sangho
    Gray, Robert M.
    [J]. 2006 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO - ICME 2006, VOLS 1-5, PROCEEDINGS, 2006, : 269 - +
  • [7] Feature selection techniques with class separability for multivariate time series
    Han, Min
    Liu, Xiaoxin
    [J]. NEUROCOMPUTING, 2013, 110 : 29 - 34
  • [8] Hippocampal Shape Classification Using Redundancy Constrained Feature Selection
    Zhou, Luping
    Wang, Lei
    Shen, Chunhua
    Barnes, Nick
    [J]. MEDICAL IMAGE COMPUTING AND COMPUTER-ASSISTED INTERVENTION - MICCAI 2010, PT II, 2010, 6362 : 266 - +
  • [9] Redundancy-constrained minimum-cost design of water-distribution nets
    Park, H.
    Liebman, J. C.
    [J]. JOURNAL OF WATER RESOURCES PLANNING AND MANAGEMENT, 1994, 120 (04) : 570 - 571
  • [10] Redundancy-constrained minimum-cost design of water-distribution nets
    Park, Heekyung
    Liebman, Jon C.
    [J]. Journal of Water Resources Planning and Management, 1993, 119 (01) : 83 - 98