An Improved Feature Selection Algorithm with Conditional Mutual Information for Classification Problems

Cited by: 0
Authors
Palanichamy, Jaganathan [1 ]
Ramasamy, Kuppuchamy [1 ]
Affiliations
[1] PSNA Coll Engn & Technol, Dept Comp Applicat, Dindigul, Tamil Nadu, India
Keywords
Mutual Information; Conditional Mutual Information; Feature Selection; Classification;
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The purpose of feature selection is to eliminate insignificant features from the entire dataset while retaining the class-discriminatory information needed for classification problems. Many feature selection algorithms have been proposed to measure the relevance and redundancy of features with respect to the class variables. In this paper, we propose an improved feature selection algorithm based on the maximum-relevance, minimum-redundancy criterion. The relevance of a feature to the class variables is evaluated with mutual information, and conditional mutual information is used to calculate the redundancy between the selected and candidate features with respect to each class variable. The algorithm is evaluated on five benchmark datasets from the UCI Machine Learning Repository. The results show that the proposed algorithm compares quite well with several existing algorithms.
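As a rough illustration of the selection criterion described in the abstract, the sketch below implements a greedy forward search that scores each candidate feature by its mutual information with the class minus its average conditional mutual information with the already-selected features given the class. The function names, the uniform averaging over selected features, and the unweighted subtraction are assumptions made for illustration; the paper's exact criterion and weighting may differ.

```python
# Minimal sketch of a max-relevance / min-redundancy selector in the spirit of
# the abstract: relevance = I(X; C), redundancy = I(X_cand; X_sel | C).
# The scoring rule below (relevance minus mean conditional redundancy) is an
# assumption for illustration, not necessarily the paper's exact formula.
import numpy as np
from collections import Counter

def entropy(values):
    """Empirical Shannon entropy (base 2) of a discrete sequence."""
    counts = np.array(list(Counter(values).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def mutual_information(x, y):
    """I(X; Y) = H(X) + H(Y) - H(X, Y) for discrete x, y."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def conditional_mutual_information(x, y, z):
    """I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z) for discrete data."""
    return (entropy(list(zip(x, z))) + entropy(list(zip(y, z)))
            - entropy(list(zip(x, y, z))) - entropy(z))

def select_features(X, c, k):
    """Greedily pick k feature indices from discrete matrix X (samples x features)
    using class labels c."""
    n_features = X.shape[1]
    relevance = [mutual_information(X[:, j], c) for j in range(n_features)]
    selected = [int(np.argmax(relevance))]          # start with the most relevant feature
    while len(selected) < k:
        best_j, best_score = None, -np.inf
        for j in range(n_features):
            if j in selected:
                continue
            # Redundancy of candidate j with the selected set, conditioned on the
            # class variable (uniform average is an assumption).
            redundancy = np.mean([
                conditional_mutual_information(X[:, j], X[:, s], c)
                for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best_j, best_score = j, score
        selected.append(best_j)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    c = rng.integers(0, 2, size=200)                       # binary class labels
    informative = (c + rng.integers(0, 2, size=200)) % 3   # class-dependent feature
    noise = rng.integers(0, 4, size=(200, 3))              # irrelevant features
    X = np.column_stack([informative, informative, noise]) # feature 1 duplicates feature 0
    print(select_features(X, c, 2))  # typically avoids picking both redundant copies
```

In this toy run, the duplicated informative feature carries high relevance but also high conditional redundancy with the first selection, so the criterion tends to prefer a non-redundant feature for the second slot, which is the behaviour the abstract's criterion is designed to encourage.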
Page count: 5