MIFS-ND: A mutual information-based feature selection method

Cited by: 251
Authors
Hoque, N. [1 ]
Bhattacharyya, D. K. [1 ]
Kalita, J. K. [2 ]
Affiliations
[1] Tezpur Univ, Dept Comp Sci & Engn, Tezpur 784028, Assam, India
[2] Univ Colorado, Dept Comp Sci, Colorado Springs, CO 80933 USA
Keywords
Features; Mutual information; Relevance; Classification; Algorithm
DOI
10.1016/j.eswa.2014.04.019
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Feature selection is used to choose a subset of relevant features for effective classification of data. In high-dimensional data classification, the performance of a classifier often depends on the feature subset used for classification. In this paper, we introduce a greedy feature selection method using mutual information. The method combines feature-feature mutual information and feature-class mutual information to find an optimal subset of features that minimizes redundancy and maximizes relevance among features. The effectiveness of the selected feature subset is evaluated using multiple classifiers on multiple datasets. In terms of both classification accuracy and execution time, our method performs significantly better than several competing feature selection techniques on twelve real-life datasets of varied dimensionality and numbers of instances. (C) 2014 Elsevier Ltd. All rights reserved.
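The abstract describes a greedy criterion that rewards feature-class mutual information (relevance) while penalizing feature-feature mutual information (redundancy). The sketch below illustrates that general idea with a simple "relevance minus average redundancy" greedy loop; it is not the paper's exact MIFS-ND ranking scheme, and the scikit-learn helpers (mutual_info_classif, mutual_info_score), the histogram binning step, and the load_wine demo dataset are assumptions made purely for illustration.

```python
# Minimal sketch of greedy mutual-information-based feature selection
# (relevance minus average redundancy), NOT the exact MIFS-ND algorithm.
import numpy as np
from sklearn.datasets import load_wine
from sklearn.feature_selection import mutual_info_classif
from sklearn.metrics import mutual_info_score


def greedy_mi_selection(X, y, k, n_bins=10):
    """Greedily pick k features: maximize feature-class MI while
    penalizing average MI with the features already selected."""
    n_features = X.shape[1]

    # Feature-class mutual information (relevance of each feature).
    relevance = mutual_info_classif(X, y, random_state=0)

    # Discretize features so pairwise feature-feature MI can be
    # estimated from co-occurrence counts.
    X_binned = np.empty(X.shape, dtype=int)
    for j in range(n_features):
        edges = np.histogram_bin_edges(X[:, j], bins=n_bins)
        X_binned[:, j] = np.digitize(X[:, j], edges[1:-1])

    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        best_j, best_score = None, -np.inf
        for j in remaining:
            # Average redundancy with the already-selected subset.
            if selected:
                redundancy = np.mean(
                    [mutual_info_score(X_binned[:, j], X_binned[:, s]) for s in selected]
                )
            else:
                redundancy = 0.0
            score = relevance[j] - redundancy  # relevance minus redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
        remaining.remove(best_j)
    return selected


if __name__ == "__main__":
    X, y = load_wine(return_X_y=True)
    print("Selected feature indices:", greedy_mi_selection(X, y, k=5))
```

The selected index list can then be used to slice the training data (e.g. X[:, selected]) before handing it to any classifier, which is how the abstract's evaluation with multiple classifiers would typically be wired up.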
Pages: 6371-6385
Page count: 15
Related papers
50 in total
  • [1] A Fuzzy Mutual Information-based Feature Selection Method for Classification
    Hoque, N.
    Ahmed, H. A.
    Bhattacharyya, D. K.
    Kalita, J. K.
    [J]. FUZZY INFORMATION AND ENGINEERING, 2016, 8 (03) : 355 - 384
  • [2] Mutual information-based feature selection for radiomics
    Oubel, Estanislao
    Beaumont, Hubert
    Iannessi, Antoine
    [J]. MEDICAL IMAGING 2016: PACS AND IMAGING INFORMATICS: NEXT GENERATION AND INNOVATIONS, 2016, 9789
  • [3] Stopping rules for mutual information-based feature selection
    Mielniczuk, Jan
    Teisseyre, Pawel
    [J]. NEUROCOMPUTING, 2019, 358 : 255 - 274
  • [4] Mutual information-based feature selection for multilabel classification
    Doquire, Gauthier
    Verleysen, Michel
    [J]. NEUROCOMPUTING, 2013, 122 : 148 - 155
  • [5] A Study on Mutual Information-Based Feature Selection in Classifiers
    Arundhathi, B.
    Athira, A.
    Rajan, Ranjidha
    [J]. ARTIFICIAL INTELLIGENCE AND EVOLUTIONARY COMPUTATIONS IN ENGINEERING SYSTEMS, ICAIECES 2016, 2017, 517 : 479 - 486
  • [6] CONDITIONAL DYNAMIC MUTUAL INFORMATION-BASED FEATURE SELECTION
    Liu, Huawen
    Mo, Yuchang
    Zhao, Jianmin
    [J]. COMPUTING AND INFORMATICS, 2012, 31 (06) : 1193 - 1216
  • [7] X-MIFS: Exact Mutual Information for Feature Selection
    Brunato, Mauro
    Battiti, Roberto
    [J]. 2016 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2016, : 3469 - 3476
  • [8] Sentiment Classification in Persian: Introducing a Mutual Information-based Method for Feature Selection
    Bagheri, Ayoub
    Saraee, Mohamad
    de Jong, Franciska
    [J]. 2013 21ST IRANIAN CONFERENCE ON ELECTRICAL ENGINEERING (ICEE), 2013,
  • [9] Comparison of Mutual Information-based Feature Selection Method for Biological Omics Datasets
    Huang, Zhijun
    [J]. 2021 8TH INTERNATIONAL CONFERENCE ON SOFT COMPUTING & MACHINE INTELLIGENCE (ISCMI 2021), 2021, : 60 - 63
  • [10] Feature redundancy term variation for mutual information-based feature selection
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    [J]. APPLIED INTELLIGENCE, 2020, 50 (04) : 1272 - 1288