Mutual information-based method for selecting informative feature sets

Cited by: 54
Authors
Herman, Gunawan [1 ,2 ]
Zhang, Bang [1 ,2 ]
Wang, Yang [1 ,2 ]
Ye, Getian [3 ]
Chen, Fang [1 ,2 ]
Affiliations
[1] Natl ICT Australia, Eveleigh, NSW 2015, Australia
[2] Univ New S Wales, Sydney, NSW 2052, Australia
[3] Canon Informat Syst Res Australia, N Ryde, NSW 2113, Australia
Funding
Australian Research Council;
Keywords
Feature selection; Mutual information; DEPENDENCY; EXTRACTION; PATTERNS;
DOI
10.1016/j.patcog.2013.04.021
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection is one of the fundamental problems in pattern recognition and data mining. A popular and effective approach to feature selection is based on information theory, namely the mutual information of features and class variable. In this paper we compare eight different mutual information-based feature selection methods. Based on the analysis of the comparison results, we propose a new mutual information-based feature selection method. By taking into account both the class-dependent and class-independent correlation among features, the proposed method selects a less redundant and more informative set of features. The advantage of the proposed method over other methods is demonstrated by the results of experiments on UCI datasets (Asuncion and Newman, 2010 [1]) and object recognition. (C) 2013 Elsevier Ltd. All rights reserved.
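The abstract's core idea, scoring candidate features by their mutual information with the class variable while penalizing correlation with already-selected features, can be illustrated with a generic greedy scheme such as Battiti's MIFS. The sketch below is an illustration of that general family only, not the paper's specific class-dependent/class-independent method; the toy data and the `beta` weight are invented for the example.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Empirical mutual information I(X; Y) of two discrete sequences, in nats."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(
        (c / n) * math.log((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

def mifs_select(features, labels, k, beta=0.5):
    """Greedy MIFS-style selection (Battiti, 1994): repeatedly pick the feature
    maximizing relevance I(f; C) minus beta times its summed redundancy
    I(f; s) over the features s selected so far."""
    remaining = list(features)  # insertion order keeps tie-breaking deterministic
    selected = []
    while remaining and len(selected) < k:
        def score(f):
            relevance = mutual_information(features[f], labels)
            redundancy = sum(mutual_information(features[f], features[s])
                             for s in selected)
            return relevance - beta * redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy data (invented for this example): f1 is the most relevant feature,
# f2 is a near-duplicate of f1, and f3 is weaker but independent of f1.
labels = [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1]
feats = {
    "f1": [0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1],
    "f2": [1, 0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1],  # f1 with one value flipped
    "f3": [0, 0, 1, 1, 0, 0, 1, 1, 0, 0, 1, 1],  # independent of f1
}
print(mifs_select(feats, labels, k=2, beta=0.7))  # ['f1', 'f3']: redundancy penalized
print(mifs_select(feats, labels, k=2, beta=0.0))  # ['f1', 'f2']: relevance only
```

With the redundancy penalty active, the near-duplicate `f2` is passed over in favor of the independent `f3`, which is exactly the kind of redundancy reduction the abstract describes.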
Pages: 3315 - 3327
Page count: 13
Related Papers
50 records total
  • [1] A Fuzzy Mutual Information-based Feature Selection Method for Classification
    Hoque, N.
    Ahmed, H. A.
    Bhattacharyya, D. K.
    Kalita, J. K.
    [J]. FUZZY INFORMATION AND ENGINEERING, 2016, 8 (03) : 355 - 384
  • [2] Mutual information-based feature selection for radiomics
    Oubel, Estanislao
    Beaumont, Hubert
    Iannessi, Antoine
    [J]. MEDICAL IMAGING 2016: PACS AND IMAGING INFORMATICS: NEXT GENERATION AND INNOVATIONS, 2016, 9789
  • [3] MIFS-ND: A mutual information-based feature selection method
    Hoque, N.
    Bhattacharyya, D. K.
    Kalita, J. K.
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2014, 41 (14) : 6371 - 6385
  • [4] Stopping rules for mutual information-based feature selection
    Mielniczuk, Jan
    Teisseyre, Pawel
    [J]. NEUROCOMPUTING, 2019, 358 : 255 - 274
  • [5] Comparison of Mutual Information-based Feature Selection Method for Biological Omics Datasets
    Huang, Zhijun
    [J]. 2021 8TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2021), 2021, : 60 - 63
  • [6] Sentiment Classification in Persian: Introducing a Mutual Information-based Method for Feature Selection
    Bagheri, Ayoub
    Saraee, Mohamad
    de Jong, Franciska
    [J]. 2013 21ST IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2013,
  • [7] Mutual information-based feature selection for multilabel classification
    Doquire, Gauthier
    Verleysen, Michel
    [J]. NEUROCOMPUTING, 2013, 122 : 148 - 155
  • [8] A Study on Mutual Information-Based Feature Selection in Classifiers
    Arundhathi, B.
    Athira, A.
    Rajan, Ranjidha
    [J]. ARTIFICIAL INTELLIGENCE AND EVOLUTIONARY COMPUTATIONS IN ENGINEERING SYSTEMS, ICAIECES 2016, 2017, 517 : 479 - 486
  • [9] CONDITIONAL DYNAMIC MUTUAL INFORMATION-BASED FEATURE SELECTION
    Liu, Huawen
    Mo, Yuchang
    Zhao, Jianmin
    [J]. COMPUTING AND INFORMATICS, 2012, 31 (06) : 1193 - 1216
  • [10] Feature redundancy term variation for mutual information-based feature selection
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    [J]. APPLIED INTELLIGENCE, 2020, 50 (04) : 1272 - 1288