Sparse feature selection via local feature and high-order label correlation

Cited by: 9
Authors
Sun, Lin [1 ,2 ]
Ma, Yuxuan [2 ]
Ding, Weiping [3 ]
Xu, Jiucheng [2 ]
Affiliations
[1] Tianjin Univ Sci & Technol, Coll Artificial Intelligence, Tianjin 300457, Peoples R China
[2] Henan Normal Univ, Coll Comp & Informat Engn, Xinxiang 453007, Peoples R China
[3] Nantong Univ, Sch Informat Sci & Technol, Nantong 226019, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Feature selection; Loss function; Manifold learning; High-order label correlation; Neighborhood rough sets; Score
DOI
10.1007/s10489-023-05136-9
CLC Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Some existing feature selection approaches neglect the correlation among labels, and most manifold-based multilabel learning models do not consider the relationship between features and labels, which degrades classification performance. To overcome these shortcomings, this work develops a new sparse feature selection approach based on local feature and high-order label correlation. First, sparsity is induced by applying the l2,1 norm to the weight coefficient matrix, and a loss function between the sample and label matrices is established to explore the potential relationship between features and labels. The loss function is sparsified directly through the weight coefficient matrix, which assigns a weight to each feature so that features with higher scores can be selected. Second, manifold learning is combined with the Laplacian score to process local features and make full use of local feature correlation. The manifold regularization in this embedded feature selection guides the exploration of potential true labels and the selection of different features for individual labels. Finally, a self-representation strategy is employed to prevent the rank of the high-order label matrix from being damaged. The high-order label weight matrix and a label error term are defined to improve the accuracy of label self-representation and correct the deviation between the self-representation scheme and the true labels, while Frobenius norm and l2 norm regularization avoid trivial solutions and overfitting. Based on the self-representation strategy, a representation function of high-order label correlation is proposed, which accurately captures the latent information among high-order labels. Local features are then scored with the Laplacian score, and an optimal feature subset with higher scores is selected. Experiments on 16 multilabel datasets show that the proposed algorithm is effective in obtaining an important feature subset and achieves strong classification performance on multilabel data.
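To make the abstract's components concrete, the objective below is only a plausible sketch of how the described terms could be combined, written in common notation; it is not the paper's exact formulation. Here X denotes the sample matrix, Y the label matrix, W the feature weight matrix penalized by the l2,1 norm, L a graph Laplacian for the manifold term, C the high-order label self-representation matrix, E the label error term, and alpha, beta, gamma, lambda are assumed trade-off parameters.

\min_{W,\,C,\,E}\; \|XW - YC\|_F^2 \;+\; \alpha \,\|W\|_{2,1} \;+\; \beta\, \mathrm{Tr}\!\left(W^{\top} X^{\top} L X W\right) \;+\; \gamma \,\|Y - YC - E\|_F^2 \;+\; \lambda \left( \|C\|_F^2 + \|E\|_2^2 \right)

Under a formulation of this kind, each feature i would be scored by the row norm \|w_i\|_2 of W (optionally combined with its Laplacian score), and the highest-scoring features retained.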
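As a further illustration, the snippet below is a minimal NumPy sketch of the generic l2,1-norm sparse selection step only (the loss between sample and label matrices plus the l2,1 penalty), solved with a standard iteratively reweighted least-squares loop. It omits the manifold, Laplacian score, and label self-representation terms, and the function name l21_feature_scores and all parameter values are hypothetical rather than taken from the paper.

import numpy as np

def l21_feature_scores(X, Y, alpha=1.0, n_iter=50, eps=1e-8):
    # Score features by the row-wise l2 norms of W obtained from
    #   min_W ||X W - Y||_F^2 + alpha * ||W||_{2,1},
    # solved with a simple iteratively reweighted least-squares (IRLS) loop.
    # Names and defaults are illustrative, not from the paper.
    d = X.shape[1]
    D = np.eye(d)                          # reweighting matrix, starts as identity
    XtX, XtY = X.T @ X, X.T @ Y
    for _ in range(n_iter):
        # closed-form update: W = (X^T X + alpha * D)^{-1} X^T Y
        W = np.linalg.solve(XtX + alpha * D, XtY)
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + eps))
    return np.linalg.norm(W, axis=1)       # higher score = more important feature

# Toy usage on synthetic data: keep the 5 highest-scoring features.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
Y = (X[:, :3] @ rng.standard_normal((3, 4)) > 0).astype(float)   # 4 binary labels
scores = l21_feature_scores(X, Y, alpha=0.5)
selected = np.argsort(scores)[::-1][:5]
print("selected feature indices:", selected)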
Pages: 565 - 591
Number of pages: 27
Related Papers
50 records in total
  • [32] Unsupervised feature selection via joint local learning and group sparse regression
    Wu, Yue
    Wang, Can
    Zhang, Yue-qing
    Bu, Jia-jun
    FRONTIERS OF INFORMATION TECHNOLOGY & ELECTRONIC ENGINEERING, 2019, 20 (04) : 538 - 553
  • [33] Unsupervised Feature Selection via Local Total-Order Preservation
    Ma, Rui
    Wang, Yijie
    Cheng, Li
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: DEEP LEARNING, PT II, 2019, 11728 : 16 - 28
  • [34] Sparse Matrix Feature Selection in Multi-label Learning
    Yang, Wenyuan
    Zhou, Bufang
    Zhu, William
    ROUGH SETS, FUZZY SETS, DATA MINING, AND GRANULAR COMPUTING, RSFDGRC 2015, 2015, 9437 : 332 - 339
  • [35] Disambiguation-based partial label feature selection via feature dependency and label consistency
    Qian, Wenbin
    Li, Yihui
    Ye, Qianzhi
    Ding, Weiping
    Shu, Wenhao
    INFORMATION FUSION, 2023, 94 : 152 - 168
  • [36] Multi-label feature selection via label relaxation
    Fan, Yuling
    Liu, Peizhong
    Liu, Jinghua
    APPLIED SOFT COMPUTING, 2025, 175
  • [37] Assessing high-order effects in feature importance via predictability decomposition
    Ontivero-Ortega, Marlis
    Faes, Luca
    Cortes, Jesus M.
    Marinazzo, Daniele
    Stramaglia, Sebastiano
    PHYSICAL REVIEW E, 2025, 111 (03)
  • [38] Two-Dimensional Unsupervised Feature Selection via Sparse Feature Filter
    Li, Junyu
    Chen, Jiazhou
    Qi, Fei
    Dan, Tingting
    Weng, Wanlin
    Zhang, Bin
    Yuan, Haoliang
    Cai, Hongmin
    Zhong, Cheng
    IEEE TRANSACTIONS ON CYBERNETICS, 2023, 53 (09) : 5605 - 5617
  • [39] Weakly-supervised label distribution feature selection via label-specific features and label correlation
    Shu, Wenhao
    Hu, Jiayu
    Qian, Wenbin
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2024, : 2181 - 2201
  • [40] Online Multi-Label Streaming Feature Selection Based on Label Group Correlation and Feature Interaction
    Liu, Jinghua
    Yang, Songwei
    Zhang, Hongbo
    Sun, Zhenzhen
    Du, Jixiang
    ENTROPY, 2023, 25 (07)