Is mutual information adequate for feature selection in regression?

Cited by: 53
Authors
Frenay, Benoit [1 ]
Doquire, Gauthier [1 ]
Verleysen, Michel [1 ]
Affiliations
[1] Catholic Univ Louvain, ICTEAM, Machine Learning Grp, B-1348 Louvain, Belgium
Keywords
Mutual information; Feature selection; Regression; MSE; MAE; Variables
DOI
10.1016/j.neunet.2013.07.003
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Feature selection is an important preprocessing step for many high-dimensional regression problems. One of the most common strategies is to select a relevant feature subset based on the mutual information criterion. However, no connection has yet been established in the machine learning literature between the use of mutual information and a regression error criterion. This is an important gap, since minimising such a criterion is ultimately the objective of interest. This paper demonstrates that, under some reasonable assumptions, the features selected with the mutual information criterion are the ones minimising the mean squared error and the mean absolute error. Conversely, it is also shown that the mutual information criterion can fail to select optimal features in some situations, which we characterise. The theoretical developments presented in this work are expected to lead in practice to a critical and efficient use of mutual information for feature selection. (C) 2013 Elsevier Ltd. All rights reserved.
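The abstract's central claim can be illustrated with a small, self-contained sketch (not taken from the paper; the histogram MI estimator, the synthetic data, and the binned conditional-mean predictor are all illustrative assumptions): in the benign case, the feature with the higher estimated mutual information with the target is also the one yielding the lower MSE for a conditional-mean predictor.

```python
import math
import random

def mutual_information(x, y, bins=8):
    """Crude histogram estimate of I(X;Y) in nats from paired samples."""
    n = len(x)
    def bin_of(v, lo, hi):
        return min(int((v - lo) / (hi - lo) * bins), bins - 1) if hi > lo else 0
    lox, hix, loy, hiy = min(x), max(x), min(y), max(y)
    joint, px, py = {}, [0] * bins, [0] * bins
    for xi, yi in zip(x, y):
        i, j = bin_of(xi, lox, hix), bin_of(yi, loy, hiy)
        joint[(i, j)] = joint.get((i, j), 0) + 1
        px[i] += 1
        py[j] += 1
    # I(X;Y) = sum_{i,j} p(i,j) * log( p(i,j) / (p(i) * p(j)) )
    return sum((c / n) * math.log(c * n / (px[i] * py[j]))
               for (i, j), c in joint.items())

def binned_mse(x, y, bins=8):
    """MSE when predicting y by the mean of y within each x-bin (approximates E[Y|X])."""
    lo, hi = min(x), max(x)
    idx = [min(int((xi - lo) / (hi - lo) * bins), bins - 1) if hi > lo else 0
           for xi in x]
    sums, counts = [0.0] * bins, [0] * bins
    for i, yi in zip(idx, y):
        sums[i] += yi
        counts[i] += 1
    means = [sums[i] / counts[i] if counts[i] else 0.0 for i in range(bins)]
    return sum((yi - means[i]) ** 2 for i, yi in zip(idx, y)) / len(y)

random.seed(0)
n = 2000
relevant = [random.gauss(0, 1) for _ in range(n)]    # informative feature
irrelevant = [random.gauss(0, 1) for _ in range(n)]  # pure noise
target = [r + 0.1 * random.gauss(0, 1) for r in relevant]

# MI ranks the informative feature first, and the same feature yields lower MSE.
mi_rel = mutual_information(relevant, target)
mi_irr = mutual_information(irrelevant, target)
mse_rel = binned_mse(relevant, target)
mse_irr = binned_mse(irrelevant, target)
```

On this synthetic example the MI ranking and the MSE ranking agree, which is the benign situation the paper formalises; the paper's counterexamples are precisely the situations where these two rankings can disagree.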
Pages: 1-7
Page count: 7
Related papers
(showing 10 of 50 records)
  • [1] Feature Selection with Mutual Information for Regression Problems
    Sulaiman, Muhammad Aliyu
    Labadin, Jane
    [J]. 2015 9TH INTERNATIONAL CONFERENCE ON IT IN ASIA (CITA), 2015,
  • [2] Feature Selection in Regression Tasks Using Conditional Mutual Information
    Latorre Carmona, Pedro
    Sotoca, Jose M.
    Pla, Filiberto
    Phoa, Frederick K. H.
    Dias, Jose Bioucas
    [J]. PATTERN RECOGNITION AND IMAGE ANALYSIS: 5TH IBERIAN CONFERENCE, IBPRIA 2011, 2011, 6669 : 224 - 231
  • [3] Multilabel Feature Selection Based on Fuzzy Mutual Information and Orthogonal Regression
    Dai, Jianhua
    Liu, Qi
    Chen, Wenxiang
    Zhang, Chucai
    [J]. IEEE TRANSACTIONS ON FUZZY SYSTEMS, 2024, 32 (09) : 5136 - 5148
  • [4] Quadratic Mutual Information Feature Selection
    Sluga, Davor
    Lotric, Uros
    [J]. ENTROPY, 2017, 19 (04)
  • [5] Weighted Mutual Information for Feature Selection
    Schaffernicht, Erik
    Gross, Horst-Michael
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2011, PT II, 2011, 6792 : 181 - 188
  • [6] Mutual Information Criteria for Feature Selection
    Zhang, Zhihong
    Hancock, Edwin R.
    [J]. SIMILARITY-BASED PATTERN RECOGNITION: FIRST INTERNATIONAL WORKSHOP, SIMBAD 2011, 2011, 7005 : 235 - 249
  • [7] Normalized Mutual Information Feature Selection
    Estevez, Pablo A.
    Tesmer, Michel
    Perez, Claudio A.
    Zurada, Jacek A.
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (02) : 189 - 201
  • [8] Feature selection with dynamic mutual information
    Liu, Huawen
    Sun, Jigui
    Liu, Lei
    Zhang, Huijie
    [J]. PATTERN RECOGNITION, 2009, 42 (07) : 1330 - 1339
  • [9] On Estimating Mutual Information for Feature Selection
    Schaffernicht, Erik
    Kaltenhaeuser, Robert
    Verma, Saurabh Shekhar
    Gross, Horst-Michael
    [J]. ARTIFICIAL NEURAL NETWORKS-ICANN 2010, PT I, 2010, 6352 : 362 - +