High-order conditional mutual information maximization for dealing with high-order dependencies in feature selection

Cited by: 19
Authors
Souza, Francisco [1 ]
Premebida, Cristiano [1 ]
Araujo, Rui [1 ]
Affiliations
[1] Univ Coimbra, Inst Syst & Robot, Coimbra, Portugal
Keywords
Feature selection; Mutual information; Information theory; Pattern recognition; Shrinkage
DOI
10.1016/j.patcog.2022.108895
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper presents a novel feature selection method based on conditional mutual information (CMI). The proposed High-Order Conditional Mutual Information Maximization (HOCMIM) method incorporates high-order dependencies into the feature selection procedure and has a straightforward interpretation due to its bottom-up derivation. HOCMIM is derived from the chain expansion of the CMI and expressed as a maximization problem, which is solved with a greedy search procedure that speeds up the entire feature selection process. Experiments are run on a set of 20 benchmark datasets. HOCMIM is compared with eighteen state-of-the-art feature selection algorithms, based on the results of two supervised learning classifiers (Support Vector Machine and K-Nearest Neighbor). HOCMIM achieves the best results in terms of accuracy and is shown to be faster than its high-order feature selection counterparts. (c) 2022 Elsevier Ltd. All rights reserved.
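To make the greedy search over CMI scores concrete, the following is a minimal Python sketch of forward feature selection driven by plug-in (histogram) estimates of conditional mutual information. It is not the paper's HOCMIM algorithm: it scores candidates with a CMIM-style pairwise approximation (the minimum over already-selected features j of I(X_k; Y | X_j)) rather than HOCMIM's higher-order expansion, and the function names (entropy, cond_mi, greedy_cmi_selection) and toy data are purely illustrative.

import numpy as np

def entropy(*cols):
    # Plug-in (histogram) entropy of the joint distribution of discrete columns.
    joint = np.stack(cols, axis=1)
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def cond_mi(x, y, z=None):
    # I(X; Y) when z is None, otherwise I(X; Y | Z), from plug-in entropies.
    if z is None:
        return entropy(x) + entropy(y) - entropy(x, y)
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

def greedy_cmi_selection(X, y, n_select):
    # Greedy forward selection: the first feature maximizes I(X_k; y); each
    # subsequent feature maximizes the conservative pairwise score
    # min over selected j of I(X_k; y | X_j) (CMIM-style approximation).
    remaining = list(range(X.shape[1]))
    first = max(remaining, key=lambda k: cond_mi(X[:, k], y))
    selected = [first]
    remaining.remove(first)
    while remaining and len(selected) < n_select:
        best = max(remaining, key=lambda k: min(cond_mi(X[:, k], y, X[:, j])
                                                for j in selected))
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy usage with already-discretized synthetic data.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 8))
y = (X[:, 0] + X[:, 3]) % 2
print(greedy_cmi_selection(X, y, n_select=3))

The plug-in estimator assumes discrete (or pre-discretized) features; continuous data would need binning or a different CMI estimator, an issue addressed by the discretization-oriented work in related entry [2] below.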
Pages: 13
Related papers
50 records in total
  • [1] Can high-order dependencies improve mutual information based feature selection?
    Nguyen Xuan Vinh
    Zhou, Shuo
    Chan, Jeffrey
    Bailey, James
    PATTERN RECOGNITION, 2016, 53 : 46 - 58
  • [2] Discretization and Feature Selection Based on Bias Corrected Mutual Information Considering High-Order Dependencies
    Roy, Puloma
    Sharmin, Sadia
    Ali, Amin Ahsan
    Shoyaib, Mohammad
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2020, PT I, 2020, 12084 : 830 - 842
  • [3] Efficient High-Order Interaction-Aware Feature Selection Based on Conditional Mutual Information
    Shishkin, Alexander
    Bezzubtseva, Anastasia
    Drutsa, Alexey
    Shishkov, Ilia
    Gladkikh, Ekaterina
    Gusev, Gleb
    Serdyukov, Pavel
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [4] Probing High-Order Dependencies With Information Theory
    Granero-Belinchon, Carlos
    Roux, Stephane G.
    Abry, Patrice
Garnier, Nicolas B.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2019, 67 (14) : 3796 - 3805
  • [5] HIGH-ORDER CONDITIONAL DISTANCE COVARIANCE WITH CONDITIONAL MUTUAL INDEPENDENCE
    Liu, Pengfei
    Ma, Xuejun
    Zhou, Wang
    PROBABILITY IN THE ENGINEERING AND INFORMATIONAL SCIENCES, 2022, 36 (01) : 126 - 143
  • [6] Conditional Random Field with High-order Dependencies for Sequence Labeling and Segmentation
    Cuong, Nguyen Viet
    Ye, Nan
    Lee, Wee Sun
    Chieu, Hai Leong
    JOURNAL OF MACHINE LEARNING RESEARCH, 2014, 15 : 981 - 1009
  • [7] High-order covariate interacted Lasso for feature selection
    Zhang, Zhihong
    Tian, Yiyang
    Bai, Lu
    Xiahou, Jianbing
    Hancock, Edwin
    PATTERN RECOGNITION LETTERS, 2017, 87 : 139 - 146
  • [8] Unsupervised feature selection with high-order similarity learning
    Mi, Yong
    Chen, Hongmei
    Luo, Chuan
    Horng, Shi-Jinn
    Li, Tianrui
    KNOWLEDGE-BASED SYSTEMS, 2024, 285
  • [9] Deep clustering via high-order mutual information maximization and pseudo-label guidance
    Liu C.
    Kong B.
    Du G.-W.
    Zhou L.-H.
    Chen H.-M.
    Bao C.-M.
Zhejiang Daxue Xuebao (Gongxue Ban)/Journal of Zhejiang University (Engineering Science), 2023, 57 (02) : 299 - 309