Sparse feature selection via local feature and high-order label correlation

Cited: 9
|
Authors
Sun, Lin [1 ,2 ]
Ma, Yuxuan [2 ]
Ding, Weiping [3 ]
Xu, Jiucheng [2 ]
Affiliations
[1] Tianjin Univ Sci & Technol, Coll Artificial Intelligence, Tianjin 300457, Peoples R China
[2] Henan Normal Univ, Coll Comp & Informat Engn, Xinxiang 453007, Peoples R China
[3] Nantong Univ, Sch Informat Sci & Technol, Nantong 226019, Peoples R China
Funding
National Natural Science Foundation of China;
关键词
Feature selection; Loss function; Manifold learning; High-order label correlation; NEIGHBORHOOD ROUGH SETS; SCORE;
DOI
10.1007/s10489-023-05136-9
CLC Number
TP18 [Theory of Artificial Intelligence];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Many existing feature selection approaches neglect the correlation among labels, and most manifold-based multilabel learning models do not consider the relationship between features and labels, which reduces classification performance. To overcome these shortcomings, this work develops a novel sparse feature selection approach via local feature and high-order label correlation. First, sparsity is induced by applying the l2,1 norm to the weight coefficient matrix, and a loss function between the sample and label matrices is established to explore the latent relationship between features and labels. The designed loss function is directly sparsified through the weight coefficient matrix, which assigns a weight to each feature so that features with higher scores can be selected. Second, manifold learning is combined with the Laplacian score to handle local features and make full use of local feature correlation. The manifold regularization embedded in the feature selection guides the exploration of potential true labels and the selection of different features for individual labels. Finally, to prevent the rank of the high-order label matrix from being damaged, a self-representation strategy is employed. The high-order label weight matrix and a label error term are then defined to improve the accuracy of the label self-representation and to correct the deviation between the self-representation scheme and the true labels, while the Frobenius and l2 norm regularizations avoid trivial solutions and overfitting. Based on the self-representation strategy, a representation function of high-order label correlation is proposed that accurately captures the latent information among high-order labels. Local features are then scored by the Laplacian score so that an optimal feature subset with higher scores can be selected.
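As a rough illustration of the first step, l2,1-norm sparse regression of the kind described above is commonly solved by iteratively reweighted least squares. The sketch below is a generic textbook scheme, not the authors' exact formulation (the function and variable names are ours): it minimizes ||XW - Y||_F^2 + lam*||W||_{2,1} and scores each feature by the row norm of the learned weight matrix W, so that rows driven to zero mark irrelevant features.

```python
import numpy as np

def l21_feature_scores(X, Y, lam=0.1, n_iter=50):
    """Iteratively reweighted least squares for
    min_W ||X W - Y||_F^2 + lam * ||W||_{2,1}.
    The l2,1 penalty drives whole rows of W to zero, so the
    row norms of W serve as per-feature importance scores."""
    d = X.shape[1]
    D = np.eye(d)                                  # reweighting matrix, D_ii = 1 / (2 ||w_i||)
    for _ in range(n_iter):
        # closed-form update of W for the current reweighting D
        W = np.linalg.solve(X.T @ X + lam * D, X.T @ Y)
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * row_norms + 1e-12))
    return np.linalg.norm(W, axis=1)

# toy data: the labels depend only on the first two of three features
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 3))
B = np.array([[1.0, 0.5], [-0.5, 1.0], [0.0, 0.0]])
Y = X @ B + 0.01 * rng.standard_normal((100, 2))
scores = l21_feature_scores(X, Y)                  # scores[2] shrinks toward zero
```

Selecting the top-scoring rows of W then yields the sparse feature subset; the reweighting loop is what makes the otherwise non-smooth l2,1 term tractable.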
Experiments on 16 multilabel datasets illustrate that the constructed algorithm is efficient at obtaining important feature subsets and delivers strong classification performance on multilabel classification tasks.
Pages: 565-591
Number of pages: 27
Related Papers
50 records
  • [21] Unsupervised feature selection via local structure learning and sparse learning
    Lei, Cong
    Zhu, Xiaofeng
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (22) : 29605 - 29622
  • [22] Multi-Label Feature Selection With Missing Features via Implicit Label Replenishment and Positive Correlation Feature Recovery
    Dai, Jianhua
    Chen, Wenxiang
    Qian, Yuhua
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2025, 37 (04) : 2042 - 2055
  • [23] Novel multi-label feature selection via label symmetric uncertainty correlation learning and feature redundancy evaluation
    Dai, Jianhua
    Chen, Jiaolong
    Liu, Ye
    Hu, Hu
    KNOWLEDGE-BASED SYSTEMS, 2020, 207
  • [24] Feature Selection Algorithm Based on Label Correlation
    Lü Y.
    Li D.
    Moshi Shibie yu Rengong Zhineng/Pattern Recognition and Artificial Intelligence, 2020, 33 (08): : 716 - 723
  • [25] Feature selection via kernel sparse representation
    Lv, Zhizheng
    Li, Yangding
    Li, Jieye
    2019 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI 2019), 2019, : 2637 - 2644
  • [26] Sparse multi-label feature selection via dynamic graph manifold regularization
    Zhang, Yao
    Ma, Yingcang
    INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2023, 14 (03) : 1021 - 1036
  • [27] Sparse multi-label feature selection via dynamic graph manifold regularization
    Yao Zhang
    Yingcang Ma
    International Journal of Machine Learning and Cybernetics, 2023, 14 : 1021 - 1036
  • [28] Structured learning for unsupervised feature selection with high-order matrix factorization
    Wang, Shiping
    Chen, Jiawei
    Guo, Wenzhong
    Liu, Genggeng
    EXPERT SYSTEMS WITH APPLICATIONS, 2020, 140
  • [29] Feature Correlation Hypergraph: Exploiting High-order Potentials for Multimodal Recognition
    Zhang, Luming
    Gao, Yue
    Hong, Chaoqun
    Feng, Yinfu
    Zhu, Jianke
    Cai, Deng
    IEEE TRANSACTIONS ON CYBERNETICS, 2014, 44 (08) : 1408 - 1419
  • [30] Unsupervised feature selection via joint local learning and group sparse regression
    Yue WU
    Can WANG
    Yue-qing ZHANG
    Jia-jun BU
    Frontiers of Information Technology & Electronic Engineering, 2019, 20 (04) : 538 - 553