Robust and Sparse Principal Component Analysis With Adaptive Loss Minimization for Feature Selection

Cited by: 12
Authors
Bian, Jintang [1 ,2 ]
Zhao, Dandan [3 ]
Nie, Feiping [1 ,4 ]
Wang, Rong [4 ]
Li, Xuelong [1 ,4 ]
Affiliations
[1] Northwestern Polytech Univ, Sch Comp Sci, Xian 710072, Peoples R China
[2] Northwestern Polytech Univ, Ctr Opt Imagery Anal & Learning, Xian 710072, Peoples R China
[3] Chongqing Univ, Coll Automat, Chongqing 400044, Peoples R China
[4] Northwestern Polytech Univ, Sch Artificial Intelligence Opt & Elect iOPEN, Xian 710072, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Adaptive loss minimization; robust principal component analysis (RPCA); structural sparsity constraint; unsupervised feature selection; UNSUPERVISED FEATURE-SELECTION; SUPERVISED FEATURE-SELECTION;
DOI
10.1109/TNNLS.2022.3194896
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Principal component analysis (PCA) is one of the most successful unsupervised subspace learning methods and has been used in many practical applications. To cope with outliers in real-world data, robust PCA models based on various robust measures have been proposed. However, conventional PCA models can only transform the original features into an unknown subspace for dimensionality reduction and cannot perform feature selection. In this article, we propose a novel robust PCA (RPCA) model that mitigates the impact of outliers and conducts feature selection simultaneously. First, we adopt the σ-norm as the reconstruction error (RE), which plays an important role in robust reconstruction. Second, to perform feature selection, we impose an ℓ2,0-norm constraint on the subspace projection matrix. Furthermore, an efficient iterative optimization algorithm is proposed to solve the objective function with its nonconvex and nonsmooth constraint. Extensive experiments conducted on several real-world datasets demonstrate the effectiveness and superiority of the proposed feature selection model.
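A hedged sketch of the kind of objective the abstract describes (the precise formulation is not reproduced in this record; the summation form and the orthogonality constraint are assumptions): for centered samples $x_i \in \mathbb{R}^{d}$ and a projection matrix $W \in \mathbb{R}^{d \times k}$, a robust sparse PCA model of this type combines a robust reconstruction term with a row-sparsity constraint,

\[
\min_{W^{\top} W = I,\; \|W\|_{2,0} \le m} \; \sum_{i=1}^{n} \left\| x_i - W W^{\top} x_i \right\|_{\sigma},
\]

where $\|\cdot\|_{\sigma}$ is the adaptive (σ-norm) loss on the reconstruction error, which behaves like a squared ℓ2 loss for small residuals and like an ℓ1 loss for large ones, so samples with large reconstruction error (likely outliers) are down-weighted. Because the ℓ2,0 constraint allows at most $m$ nonzero rows of $W$, the indices of those rows directly identify the selected features; the iterative algorithm mentioned in the abstract is designed to handle the nonconvexity and nonsmoothness of this constraint.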
Pages: 3601-3614
Number of pages: 14