A General Framework for Class Label Specific Mutual Information Feature Selection Method

Cited by: 10
Authors
Rakesh, Deepak Kumar [1 ]
Jana, Prasanta K. [1 ]
Affiliations
[1] Indian Inst Technol ISM Dhanbad, Dept Comp Sci & Engn, Dhanbad 826004, Bihar, India
Keywords
Mutual information; Feature extraction; Redundancy; Entropy; Magnetic resonance imaging; Information filters; Correlation; Feature selection; filter method; information theory; class label specific mutual information; classification; DEPENDENCY; RELEVANCE; SMOTE;
DOI
10.1109/TIT.2022.3188708
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Information theory-based feature selection (ITFS) methods select a single subset of features for all classes based on two criteria: 1) minimizing redundancy among the selected features and 2) maximizing the classification information the selected features carry about the classes. A critical issue with selecting a single subset of features is that it may not represent a feature space in which each individual class label can be separated from the others. Existing methods provide no way to select a feature space specific to an individual class label. To this end, we propose a novel feature selection method called class-label specific mutual information (CSMI) that selects a specific set of features for each class label. The proposed method maximizes the information shared between the selected features and the target class label while minimizing the information they share with all classes. We also account for the dynamic change of information between the selected features and the target class label when a candidate feature is added. Finally, we provide a general framework for CSMI that makes it classifier-independent. Experiments on sixteen benchmark data sets with four classifiers show that CSMI outperforms five traditional ITFS methods, two state-of-the-art ITFS methods for multi-class classification, and one multi-label classification method.
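The abstract describes a greedy, filter-style criterion: for each class label, favor features that are informative about that label (treated one-vs-rest) while penalizing the information they share with the full class variable and their redundancy with features already selected. Below is a minimal Python sketch of that idea, assuming scikit-learn's mutual-information estimators as stand-ins for the paper's estimator; the function select_for_class and its exact scoring are illustrative only, and the paper's dynamic-change term is omitted.

    import numpy as np
    from sklearn.feature_selection import mutual_info_classif, mutual_info_regression

    def select_for_class(X, y, target_class, k):
        """Greedy CSMI-style selection of k features for one class label."""
        y_bin = (y == target_class).astype(int)    # target class vs. rest
        relevance = mutual_info_classif(X, y_bin)  # I(f; target label)
        shared = mutual_info_classif(X, y)         # I(f; all class labels)
        selected, candidates = [], list(range(X.shape[1]))
        for _ in range(k):
            def score(f):
                # Redundancy: mean MI between candidate f and chosen features.
                red = (np.mean([mutual_info_regression(X[:, [s]], X[:, f])[0]
                                for s in selected]) if selected else 0.0)
                # Favor class-specific information; penalize information shared
                # with the full class variable and redundancy in the subset.
                return relevance[f] - shared[f] - red
            best = max(candidates, key=score)
            selected.append(best)
            candidates.remove(best)
        return selected

    # One feature subset per class label, as CSMI advocates:
    # subsets = {c: select_for_class(X, y, c, k=10) for c in np.unique(y)}

The one-vs-rest binarization is one plausible reading of "class label specific"; the published criterion and its estimators may differ in detail.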
Pages: 7996-8014
Page count: 19
Related Papers (50 in total)
  • [31] Mutual Information-based multi-label feature selection using interaction information
    Lee, Jaesung
    Kim, Dae-Won
    EXPERT SYSTEMS WITH APPLICATIONS, 2015, 42 (04) : 2013 - 2025
  • [32] Weighted Mutual Information for Feature Selection
    Schaffernicht, Erik
    Gross, Horst-Michael
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2011, PT II, 2011, 6792 : 181 - 188
  • [33] Quadratic Mutual Information Feature Selection
    Sluga, Davor
    Lotric, Uros
    ENTROPY, 2017, 19 (04)
  • [34] Mutual Information Criteria for Feature Selection
    Zhang, Zhihong
    Hancock, Edwin R.
    SIMILARITY-BASED PATTERN RECOGNITION: FIRST INTERNATIONAL WORKSHOP, SIMBAD 2011, 2011, 7005 : 235 - 249
  • [35] Normalized Mutual Information Feature Selection
Estevez, Pablo A.
    Tesmer, Michel
    Perez, Claudio A.
Zurada, Jacek M.
IEEE TRANSACTIONS ON NEURAL NETWORKS, 2009, 20 (02) : 189 - 201
  • [37] On Estimating Mutual Information for Feature Selection
    Schaffernicht, Erik
    Kaltenhaeuser, Robert
    Verma, Saurabh Shekhar
    Gross, Horst-Michael
ARTIFICIAL NEURAL NETWORKS - ICANN 2010, PT I, 2010, 6352 : 362+
  • [38] Feature selection with dynamic mutual information
    Liu, Huawen
    Sun, Jigui
    Liu, Lei
    Zhang, Huijie
    PATTERN RECOGNITION, 2009, 42 (07) : 1330 - 1339
  • [39] An improved feature selection method based on angle-guided multi-objective PSO and feature-label mutual information
    Han, Fei
    Wang, Tianyi
    Ling, Qinghua
    APPLIED INTELLIGENCE, 2023, 53 (03) : 3545 - 3562