Semi-Supervised Feature Selection via Sparse Rescaled Linear Square Regression

Cited by: 71
|
Authors
Chen, Xiaojun [1 ]
Yuan, Guowen [1 ]
Nie, Feiping [2 ,3 ]
Ming, Zhong [1 ]
Affiliations
[1] Shenzhen Univ, Coll Comp Sci & Software, Shenzhen 518060, Guangdong, Peoples R China
[2] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Shanxi, Peoples R China
[3] Northwestern Polytech Univ, Ctr Opt IMagery Anal & Learning OPTIMAL, Xian 710072, Shanxi, Peoples R China
Keywords
Feature extraction; Computational complexity; Laplace equations; Knowledge discovery; Data engineering; Iterative methods; Adaptation models; Feature selection; semi-supervised feature selection; sparse feature selection; least square regression; CLASSIFICATION;
DOI
10.1109/TKDE.2018.2879797
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
With the rapid growth of data sizes, there is an increasing demand for selecting features by exploiting both labeled and unlabeled data. In this paper, we propose a novel semi-supervised embedded feature selection method. The new method extends the least square regression model by rescaling the regression coefficients with a set of scale factors, which are used to evaluate the importance of features. An iterative algorithm is proposed to optimize the new model. It is proved that solving the new model is equivalent to solving a sparse model with a flexible and adaptable l_{2,p}-norm regularization. Moreover, the optimal solution of the scale factors provides a theoretical explanation for why we can use {||w_1||_2, ..., ||w_d||_2} to evaluate the importance of features. Experimental results on eight benchmark data sets show the superior performance of the proposed method.
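The abstract describes scoring each feature by the l2-norm of the corresponding row of the regression coefficient matrix W, obtained from a sparse (l_{2,p}-regularized) least square regression. The sketch below illustrates this idea for the common p = 1 case using a standard iteratively reweighted least squares (IRLS) scheme; it is a simplified, fully supervised illustration of the row-norm scoring principle, not the paper's exact semi-supervised rescaled algorithm, and the function name and parameters are chosen here for illustration.

```python
import numpy as np

def l21_feature_scores(X, Y, gamma=1.0, n_iter=50, eps=1e-8):
    """Score features by solving  min_W ||XW - Y||_F^2 + gamma * ||W||_{2,1}
    with iteratively reweighted least squares (IRLS).

    X: (n, d) data matrix; Y: (n, c) target/label matrix.
    Returns the row norms {||w_1||_2, ..., ||w_d||_2}, which rank features.
    Simplified supervised sketch -- not the paper's semi-supervised model.
    """
    n, d = X.shape
    D = np.eye(d)  # diagonal reweighting matrix, updated each iteration
    W = np.zeros((d, Y.shape[1]))
    for _ in range(n_iter):
        # Closed-form update for fixed D:  W = (X^T X + gamma * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        # D_ii = 1 / (2 ||w_i||_2); eps guards against division by zero
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
    return np.linalg.norm(W, axis=1)
```

Features with larger scores are retained; the l_{2,1} penalty drives the rows of W for irrelevant features toward zero, so the top-scored features form the selected subset.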
Pages: 165 - 176
Number of pages: 12
Related Papers
50 records
  • [31] Locality sensitive semi-supervised feature selection
    Zhao, Jidong
    Lu, Ke
    He, Xiaofei
    [J]. NEUROCOMPUTING, 2008, 71 (10-12) : 1842 - 1849
  • [32] Semi-supervised relevance index for feature selection
    Coelho, Frederico
    Castro, Cristiano
    Braga, Antonio P.
    Verleysen, Michel
    [J]. NEURAL COMPUTING & APPLICATIONS, 2019, 31 (Suppl 2) : 989 - 997
  • [33] Simple strategies for semi-supervised feature selection
    Sechidis, Konstantinos
    Brown, Gavin
    [J]. MACHINE LEARNING, 2018, 107 (02) : 357 - 395
  • [34] Semi-supervised sparse feature selection based on multi-view Laplacian regularization
    Shi, Caijuan
    Ruan, Qiuqi
    An, Gaoyun
    Ge, Chao
    [J]. IMAGE AND VISION COMPUTING, 2015, 41 : 1 - 10
  • [35] Semi-supervised feature selection analysis with structured multi-view sparse regularization
    Shi, Caijuan
    Duan, Changyu
    Gu, Zhibin
    Tian, Qi
    An, Gaoyun
    Zhao, Ruizhen
    [J]. NEUROCOMPUTING, 2019, 330 : 412 - 424
  • [36] Sparse semi-supervised multi-label feature selection based on latent representation
    Zhao, Xue
    Li, Qiaoyan
    Xing, Zhiwei
    Yang, Xiaofei
    Dai, Xuezhen
    [J]. COMPLEX & INTELLIGENT SYSTEMS, 2024, 10 (04) : 5139 - 5151
  • [37] Efficient Semi-Supervised Learning and Sparse Structural Learning for Feature Selection of Leukemia Dataset
    Roopa, S. Nithya
    Nagarajan, N.
    [J]. JOURNAL OF MEDICAL IMAGING AND HEALTH INFORMATICS, 2020, 10 (08) : 1815 - 1824
  • [38] Semi-supervised Regression with Data Partitioning and Feature Mapping
    Liu, Li-Yan
    Zhang, Jia-Hui
    Min, Fan
    [J]. 2022 IEEE 9TH INTERNATIONAL CONFERENCE ON DATA SCIENCE AND ADVANCED ANALYTICS (DSAA), 2022 : 76 - 85
  • [40] A recursive feature retention method for semi-supervised feature selection
    Pang, Qingqing
    Zhang, Li
    [J]. INTERNATIONAL JOURNAL OF MACHINE LEARNING AND CYBERNETICS, 2021, 12 (09) : 2639 - 2657