Robust Feature Selection Method Based on Joint L2,1 Norm Minimization for Sparse Regression

Cited by: 4
Authors
Yang, Libo [1 ]
Zhu, Dawei [1 ]
Liu, Xuemei [1 ]
Cui, Pei [2 ]
Affiliations
[1] North China Univ Water Resources & Elect Power, Sch Informat Engn, Zhengzhou 450046, Peoples R China
[2] Yellow River Water & Hydropower Dev Grp Co Ltd, Zhengzhou 450018, Peoples R China
Keywords
feature selection; regression analysis; sparse projection; minimization; supervised learning; ILLUMINATION;
DOI
10.3390/electronics12214450
Chinese Library Classification: TP [Automation Technology; Computer Technology]
Discipline Classification Code: 0812
Abstract
Feature selection methods are widely used in machine learning tasks to reduce dimensionality and improve model performance. However, traditional regression-based feature selection methods often lack robustness and generalization ability and are easily affected by outliers in the data. To address this problem, we propose a robust feature selection method based on sparse regression. This method uses a non-squared form of the L2,1 norm as both the loss function and the regularization term, which effectively enhances the model's resistance to outliers while achieving feature selection simultaneously. Furthermore, to improve the model's robustness and prevent overfitting, we add an elastic variable to the loss function. We design two efficient convergent iterative procedures to solve the L2,1-norm-based optimization problem and propose a robust joint sparse regression algorithm. Extensive experimental results on three public datasets show that our feature selection method outperforms the comparison methods.
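To illustrate the general idea behind L2,1-norm sparse regression for feature selection, here is a minimal sketch of the standard iteratively reweighted solver for the simpler squared-loss variant, min_W ||XW − Y||_F² + γ||W||_{2,1}. This is not the authors' exact algorithm (which uses a non-squared L2,1 loss and an elastic variable); the function name and parameters are our own, chosen for illustration. The L2,1 regularizer drives whole rows of W to zero, so row norms of W serve as feature scores.

```python
import numpy as np

def l21_feature_scores(X, Y, gamma=1.0, n_iter=50, eps=1e-8):
    """Iteratively reweighted solver (illustrative, not the paper's method) for
        min_W ||X W - Y||_F^2 + gamma * ||W||_{2,1},
    where ||W||_{2,1} = sum_i ||w_i||_2 over rows of W.
    Returns row norms of W: large values mark informative features."""
    W = np.linalg.lstsq(X, Y, rcond=None)[0]  # warm start: ordinary least squares
    for _ in range(n_iter):
        # Reweighting matrix D_ii = 1 / (2 ||w_i||_2); eps guards zero rows
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
        # Closed-form update of the reweighted least-squares subproblem
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
    return np.linalg.norm(W, axis=1)

# Usage on synthetic data: only the first 3 of 10 features are informative
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
true_W = np.zeros((10, 2))
true_W[:3] = rng.standard_normal((3, 2))
Y = X @ true_W + 0.01 * rng.standard_normal((100, 2))
scores = l21_feature_scores(X, Y, gamma=0.5)
top3 = np.argsort(scores)[::-1][:3]
```

Because the regularizer penalizes entire rows of W jointly, features whose rows shrink to (near) zero are discarded together across all regression targets, which is what makes the L2,1 norm suitable for joint feature selection.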
Pages: 18