FSCME: A Feature Selection Method Combining Copula Correlation and Maximal Information Coefficient by Entropy Weights

Cited: 0
Authors
Zhong, Qi [1 ]
Shang, Junliang [1 ]
Ren, Qianqian [1 ]
Li, Feng [1 ]
Jiao, Cui-Na [1 ]
Liu, Jin-Xing [1 ]
Affiliation
[1] Qufu Normal Univ, Sch Comp Sci, Rizhao 276826, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Power capacitors; Mutual information; Correlation; Indexes; Microwave integrated circuits; Entropy; Copula; entropy; feature selection; gene selection; mutual information; MUTUAL INFORMATION; RELEVANCE;
DOI
10.1109/JBHI.2024.3409628
CLC number
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
Feature selection is a critical component of data mining and has garnered significant attention in recent years. However, feature selection methods based on information entropy often introduce complex forms of mutual information to measure features, which increases redundancy and can introduce errors. To address this issue, we propose FSCME, a feature selection method combining Copula correlation (Ccor) and the maximal information coefficient (MIC) by entropy weights. FSCME considers both the relevance between features and labels and the redundancy between candidate and selected features: it uses Ccor to measure the redundancy between features and to estimate the relevance between features and labels, while MIC strengthens the credibility of the feature-label correlation. The Entropy Weight Method (EWM) is then employed to evaluate and assign weights to Ccor and MIC. The experimental results demonstrate that FSCME yields a more effective feature subset for subsequent clustering, significantly improving classification performance compared with six other feature selection methods.
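The pipeline described in the abstract (relevance and redundancy scores combined by entropy weights in a greedy selection loop) can be sketched roughly as follows. The paper's exact FSCME algorithm is not given in this record, so this is an illustrative assumption throughout: Spearman's rho (a copula-based rank correlation) stands in for Ccor, a plain histogram mutual-information estimate stands in for MIC, and the function names and greedy scoring rule are the sketch's own, not the authors'.

```python
import numpy as np
from scipy.stats import spearmanr

def entropy_weights(criteria, eps=1e-12):
    """Entropy Weight Method: `criteria` is (n_candidates, n_criteria),
    non-negative; criteria with more dispersion receive larger weight."""
    criteria = np.asarray(criteria, dtype=float)
    n, m = criteria.shape
    if n < 2:
        return np.full(m, 1.0 / m)
    p = criteria / (criteria.sum(axis=0, keepdims=True) + eps)
    e = -np.sum(p * np.log(p + eps), axis=0) / np.log(n)  # entropy per criterion
    d = np.clip(1.0 - e, 0.0, None)                       # divergence degree
    return d / d.sum() if d.sum() > eps else np.full(m, 1.0 / m)

def hist_mi(x, y, bins=8):
    """Crude histogram mutual-information estimate (stand-in for MIC)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def select_features(X, y, k):
    """Greedy selection: at each step, score every candidate by an
    entropy-weighted sum of (relevance to y, 1 - redundancy with the
    already-selected set) and pick the highest-scoring feature."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k and remaining:
        rel = np.array([hist_mi(X[:, j], y) for j in remaining])
        if selected:
            # mean |Spearman rho| with selected features (copula-based stand-in)
            red = np.array([np.mean([abs(spearmanr(X[:, j], X[:, s])[0])
                                     for s in selected]) for j in remaining])
        else:
            red = np.zeros(len(remaining))
        crit = np.column_stack([rel, 1.0 - red])
        score = crit @ entropy_weights(crit)
        best = remaining[int(np.argmax(score))]
        selected.append(best)
        remaining.remove(best)
    return selected
```

In this sketch, a feature that duplicates an already-selected one gets a redundancy near 1, so its combined score collapses after the first pick, which is the qualitative behavior the abstract attributes to the Ccor redundancy term.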
Pages: 5638-5648 (11 pages)