Multi-label feature selection based on minimizing feature redundancy of mutual information

Cited: 1
Authors
Zhou, Gaozhi [2 ]
Li, Runxin [1 ,2 ]
Shang, Zhenhong [2 ]
Li, Xiaowu [2 ]
Jia, Lianyin [2 ]
Affiliations
[1] Kunming Univ Sci & Technol, Yunnan Key Lab Comp Technol Applicat, Kunming 650500, Peoples R China
[2] Kunming Univ Sci & Technol, Fac Informat Engn & Automat, Kunming 650500, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Multi-label feature selection; Mutual information; Sparse model; Redundant correlation; OPTIMIZATION ALGORITHM; SHRINKAGE; COMMON;
DOI
10.1016/j.neucom.2024.128392
CLC Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Multi-label feature selection is an indispensable technique in the preprocessing of high-dimensional multi-label data. Approaches built on information theory and sparse models hold promise in this domain and demonstrate strong performance. Although an extensive literature uses the l1-norm and l2,1-norm to identify label-specific features and common features in the feature space, these methods all ignore the redundant-information interference that arises when the two kinds of features are learned simultaneously. Considering that features and labels in multi-label data are rarely linearly correlated, the MFS-MFR approach is presented, which uses a mutual information estimator to build a representation of the nonlinear correlation between features and labels. MFS-MFR then detects specific and common features in this feature-label mutual information space using two coefficient matrices constrained by the l1-norm and the l2,1-norm, respectively. In particular, we define a nonzero correlation constraint that effectively minimizes the redundant correlation between the two matrices. Moreover, a manifold regularization term is devised to preserve the local information of the mutual information space. To solve the optimization model with its nonlinear binary regularization term, we employ a novel solution approach called S-FISTA. Extensive experiments on 15 multi-label benchmark datasets, comparing against 11 top-performing multi-label feature selection methods, demonstrate the superior performance of MFS-MFR.
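Illustrative sketch: to make the pipeline described in the abstract concrete, the Python sketch below combines the two main ingredients, a feature-label mutual-information matrix as a nonlinear relevance representation, and two coefficient matrices penalized by the l1-norm (label-specific features) and the l2,1-norm (common features), fitted with an accelerated FISTA-style proximal-gradient loop. The least-squares data term, the function names (mi_matrix, prox_l1, prox_l21, select_features), and all parameter values are assumptions for illustration; the sketch omits the paper's nonzero-correlation constraint, manifold regularization term, and the specifics of S-FISTA.

import numpy as np
from sklearn.feature_selection import mutual_info_classif

def mi_matrix(X, Y):
    """d x q matrix of mutual information between each feature and each label."""
    d, q = X.shape[1], Y.shape[1]
    M = np.zeros((d, q))
    for j in range(q):
        # one binary label column at a time; features treated as continuous
        M[:, j] = mutual_info_classif(X, Y[:, j], discrete_features=False)
    return M

def prox_l1(A, t):
    """Soft-thresholding: proximal operator of t * ||A||_1 (entry-wise sparsity)."""
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def prox_l21(A, t):
    """Row-wise shrinkage: proximal operator of t * ||A||_{2,1} (row sparsity)."""
    norms = np.linalg.norm(A, axis=1, keepdims=True)
    return A * np.maximum(1.0 - t / np.maximum(norms, 1e-12), 0.0)

def select_features(X, Y, alpha=0.01, beta=0.01, n_iter=200, k=20):
    """Rank features by reconstructing the MI matrix M with W + V, where W is
    l1-regularized (label-specific) and V is l2,1-regularized (common).
    Illustrative surrogate objective: 0.5*||M - (W + V)||_F^2 + alpha*||W||_1
    + beta*||V||_{2,1}, minimized with FISTA-style accelerated proximal steps."""
    M = mi_matrix(X, Y)
    d, q = M.shape
    W, V = np.zeros((d, q)), np.zeros((d, q))
    Zw, Zv = W.copy(), V.copy()   # extrapolation points
    t, step = 1.0, 0.5            # step = 1/L; L = 2 for this smooth term
    for _ in range(n_iter):
        R = (Zw + Zv) - M         # gradient of the smooth term at (Zw, Zv)
        Wn = prox_l1(Zw - step * R, step * alpha)
        Vn = prox_l21(Zv - step * R, step * beta)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Zw = Wn + ((t - 1.0) / t_next) * (Wn - W)
        Zv = Vn + ((t - 1.0) / t_next) * (Vn - V)
        W, V, t = Wn, Vn, t_next
    score = np.linalg.norm(W + V, axis=1)   # per-feature importance
    return np.argsort(score)[::-1][:k]      # indices of the top-k features

# Hypothetical usage: random data with 100 features and 6 binary labels
# X = np.random.rand(500, 100)
# Y = (np.random.rand(500, 6) > 0.7).astype(int)
# print(select_features(X, Y, k=20))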
Pages: 16