Efficient Leave-One-Out Strategy for Supervised Feature Selection

Cited by: 0
Authors
Dingcheng Feng [1 ]
Feng Chen [1 ]
Wenli Xu [1 ]
Affiliations
[1] National Laboratory for Information Science and Technology, Tsinghua University
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
leave-one-out; feature selection objectives; evaluation metrics;
DOI
Not available
CLC number
TP301.6 (Theory of algorithms);
Subject classification code
081202;
Abstract
Feature selection is a key task in statistical pattern recognition. Most feature selection algorithms are based on specific objective functions that are usually intuitively reasonable but can be far removed from the more basic objectives of feature selection. This paper describes how to select features so that the basic objectives, e.g., classification or clustering accuracy, can be optimized more directly. The analysis requires that each feature's contribution to the evaluation metric can be quantitatively described by a score function. Motivated by the conditional independence structure of probability distributions, the analysis uses a leave-one-out feature selection algorithm that provides an approximate solution. The leave-one-out algorithm improves on the conventional greedy backward elimination algorithm by preserving more interactions among features during the selection process, so that various feature selection objectives can be optimized in a unified way. Experiments on six real-world datasets with different feature evaluation metrics show that this algorithm outperforms popular feature selection algorithms in most situations.
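A minimal sketch of the generic leave-one-out scoring idea the abstract describes (not the paper's exact algorithm): each feature is scored by how much the evaluation metric drops when that feature alone is removed from the full set, so every score is computed in the context of all other features. The function names and the toy metric below are illustrative assumptions, not taken from the paper.

```python
def leave_one_out_select(features, score_fn, k):
    """Keep the k features whose individual removal hurts score_fn most.

    features : list of feature identifiers
    score_fn : maps a set of features to an evaluation metric (higher is better)
    k        : number of features to retain
    """
    full = set(features)
    base = score_fn(full)
    # Contribution of each feature = drop in the metric when it is left out
    # of the FULL set, so interactions with all other features are preserved.
    contrib = {f: base - score_fn(full - {f}) for f in features}
    # Retain the k features with the largest contributions.
    return sorted(features, key=lambda f: contrib[f], reverse=True)[:k]

# Toy metric: features 0 and 1 are only useful together (an interaction),
# feature 2 adds a little on its own, feature 3 is pure noise.
def toy_score(s):
    score = 0.0
    if 0 in s and 1 in s:
        score += 1.0  # interaction term
    if 2 in s:
        score += 0.3
    return score

print(leave_one_out_select([0, 1, 2, 3], toy_score, 2))  # -> [0, 1]
```

Because both interacting features lose the full interaction term when either one is left out, each receives a high score and the pair survives; a greedy backward elimination that re-scores after each removal can break such pairs apart.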
Pages: 629-635 (7 pages)