A General Framework for Class Label Specific Mutual Information Feature Selection Method

Cited by: 10
Authors
Rakesh, Deepak Kumar [1 ]
Jana, Prasanta K. [1 ]
Affiliations
[1] Indian Inst Technol ISM Dhanbad, Dept Comp Sci & Engn, Dhanbad 826004, Bihar, India
Keywords
Mutual information; Feature extraction; Redundancy; Entropy; Magnetic resonance imaging; Information filters; Correlation; Feature selection; filter method; information theory; class label specific mutual information; classification; DEPENDENCY; RELEVANCE; SMOTE;
DOI
10.1109/TIT.2022.3188708
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Information theory-based feature selection (ITFS) methods select a single subset of features for all classes based on the following criteria: 1) minimizing redundancy among the selected features and 2) maximizing the classification information shared between the selected features and the classes. A critical issue with selecting a single subset of features is that it may not represent a feature space in which individual class labels can be separated exclusively. Existing methods provide no way to select a feature space specific to an individual class label. To this end, we propose a novel feature selection method called class-label specific mutual information (CSMI) that selects a specific set of features for each class label. The proposed method maximizes the information shared between the selected features and the target class label while minimizing the information shared with all classes. We also account for the dynamic change of information between the selected features and the target class label when a candidate feature is added. Finally, we provide a general framework for CSMI that makes it classifier-independent. We perform experiments on sixteen benchmark data sets using four classifiers and find that CSMI outperforms five traditional ITFS methods, two state-of-the-art ITFS methods (multi-class classification), and one multi-label classification method.
Pages: 7996-8014
Page count: 19
Related Papers
50 records total
  • [1] Class-specific mutual information variation for feature selection
    Gao, Wanfu
    Hu, Liang
    Zhang, Ping
    PATTERN RECOGNITION, 2018, 79 : 328 - 339
  • [2] General framework for class-specific feature selection
    Pineda-Bautista, Barbara B.
    Carrasco-Ochoa, J. A.
    Martinez-Trinidad, J. Fco.
    EXPERT SYSTEMS WITH APPLICATIONS, 2011, 38 (08) : 10018 - 10024
  • [3] Feature-specific mutual information variation for multi-label feature selection
    Hu, Liang
    Gao, Lingbo
    Li, Yonghao
    Zhang, Ping
    Gao, Wanfu
    INFORMATION SCIENCES, 2022, 593 : 449 - 471
  • [4] A Fast Feature Selection Method Based on Mutual Information in Multi-label Learning
    Sun, Zhenqiang
    Zhang, Jia
    Luo, Zhiming
    Cao, Donglin
    Li, Shaozi
    COMPUTER SUPPORTED COOPERATIVE WORK AND SOCIAL COMPUTING, CHINESECSCW 2018, 2019, 917 : 424 - 437
  • [5] Approximating mutual information for multi-label feature selection
    Lee, J.
    Lim, H.
    Kim, D.-W.
    ELECTRONICS LETTERS, 2012, 48 (15) : 929 - 930
  • [6] Multi-Label Feature Selection with Conditional Mutual Information
    Wang, Xiujuan
    Zhou, Yuchen
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [7] Estimating mutual information for feature selection in the presence of label noise
    Frenay, Benoit
    Doquire, Gauthier
    Verleysen, Michel
    COMPUTATIONAL STATISTICS & DATA ANALYSIS, 2014, 71 : 832 - 848
  • [8] Feature selection based on label distribution and fuzzy mutual information
    Xiong, Chuanzhen
    Qian, Wenbin
    Wang, Yinglong
    Huang, Jintao
    INFORMATION SCIENCES, 2021, 574 : 297 - 319
  • [9] Partial label feature selection via label disambiguation and neighborhood mutual information
    Ding, Jinfei
    Qian, Wenbin
    Li, Yihui
    Yang, Wenji
    Huang, Jintao
    INFORMATION SCIENCES, 2024, 680
  • [10] A novel framework for multi-label feature selection: integrating mutual information and Pythagorean fuzzy CRADIS
    Mohanrasu, S. S.
    Rakkiyappan, R.
    Granular Computing, 2024, 9 (03)