Efficient Sparse Recovery via Adaptive Non-Convex Regularizers with Oracle Property

Cited by: 0
Authors
Lin, Ming [1 ]
Jin, Rong [2 ]
Zhang, Changshui [1 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Michigan State Univ, Comp Sci & Engn, E Lansing, MI 48823 USA
Keywords
HIGH-DIMENSIONAL REGRESSION; VARIABLE SELECTION; CONVERGENCE; SHRINKAGE; ALGORITHM; LASSO;
DOI
(not available)
CLC number
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
The main shortcoming of sparse recovery with a convex regularizer is that it yields a biased estimator and therefore results in suboptimal performance in many cases. Recent studies have shown, both theoretically and empirically, that non-convex regularizers are able to overcome the biased-estimation problem. Although multiple algorithms have been developed for sparse recovery with non-convex regularization, they are either computationally demanding or lack the desired properties (i.e., optimal recovery error, selection consistency, and the oracle property). In this work, we develop an algorithm for efficient sparse recovery based on proximal gradient descent. The key feature of the proposed algorithm is the introduction of adaptive non-convex regularizers whose shrinking thresholds vary over iterations. The algorithm is compatible with most popular non-convex regularizers, achieves a geometric convergence rate for the recovery error, is selection consistent, and, most importantly, has the oracle property. Based on the proposed framework, we suggest using a so-called ACCQ regularizer, which is equivalent to zero-proximal-projection-gap adaptive hard thresholding. Experiments on both synthetic data sets and real images verify the efficiency and effectiveness of the proposed method compared with state-of-the-art methods for sparse recovery.
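The abstract's core idea, proximal gradient descent whose hard-thresholding level shrinks over iterations, can be sketched as follows. This is an illustrative reconstruction, not the paper's ACCQ algorithm: the function name `adaptive_prox_grad`, the geometric threshold schedule `thresh0 * decay**t`, and all parameter values are assumptions made for the sketch.

```python
import numpy as np

def adaptive_prox_grad(A, y, step, thresh0, decay, n_iter=100):
    """Sparse recovery of x from y ≈ A @ x by proximal gradient descent
    with an adaptive hard-thresholding proximal operator whose threshold
    shrinks geometrically over iterations (illustrative sketch)."""
    x = np.zeros(A.shape[1])
    thresh = thresh0
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)                   # gradient of 0.5 * ||A x - y||^2
        z = x - step * grad                        # gradient descent step
        x = np.where(np.abs(z) > thresh, z, 0.0)   # hard-thresholding prox step
        thresh *= decay                            # shrink threshold each iteration
    return x

# Example: recover a 3-sparse vector from noiseless Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 20)) / np.sqrt(100)
x_true = np.zeros(20)
x_true[[2, 7, 15]] = [3.0, -2.0, 4.0]
y = A @ x_true
L = np.linalg.norm(A, 2) ** 2                      # spectral norm squared = Lipschitz const.
x_hat = adaptive_prox_grad(A, y, step=1.0 / L, thresh0=2.0, decay=0.9, n_iter=200)
```

A large initial threshold keeps spurious coordinates at zero in the early iterations, while the decaying schedule eventually lets the smaller true coefficients (here the entry of magnitude 2) enter the support, which is the intuition behind varying the shrinking threshold over iterations.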
Pages: 505 - 514
Page count: 10
Related Papers
50 records
  • [31] Adaptive Hammerstein Filtering via Recursive Non-Convex Projection
    Liu, Zhaoting
    Li, Chunguang
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 2869 - 2882
  • [32] A New Sufficient Condition for Non-Convex Sparse Recovery via Weighted lr-l1 Minimization
    Huang, Jianwen
    Zhang, Feng
    Jia, Jinping
    IEEE SIGNAL PROCESSING LETTERS, 2022, 29 : 1555 - 1558
  • [33] Group sparsity extension of "Non-convex sparse regularization via convex optimization for impact force
    Liu, Junjiang
    Qiao, Baijie
    Wang, Yanan
    He, Weifeng
    Chen, Xuefeng
    MECHANICAL SYSTEMS AND SIGNAL PROCESSING, 2023, 201
  • [34] NON-CONVEX SUPER-RESOLUTION OF OCT IMAGES VIA SPARSE REPRESENTATION
    Scrivanti, Gabriele
    Calatroni, Luca
    Morigi, Serena
    Nicholson, Lindsay
    Achim, Alin
    2021 IEEE 18TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI), 2021, : 621 - 624
  • [35] Kernel group sparse representation classifier via structural and non-convex constraints
    Zheng, Jianwei
    Qiu, Hong
    Sheng, Weiguo
    Yang, Xi
    Yu, Hongchuan
    NEUROCOMPUTING, 2018, 296 : 1 - 11
  • [36] THE CONVERGENCE GUARANTEES OF A NON-CONVEX APPROACH FOR SPARSE RECOVERY USING REGULARIZED LEAST SQUARES
    Chen, Laming
    Gu, Yuantao
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,
  • [37] Robust adaptive beamforming based on sparse reconstruction using a non-convex optimisation algorithm
    Chen, Pei
    Zhao, Yongjun
    Liu, Chengcheng
    ELECTRONICS LETTERS, 2016, 52 (19) : 1584 - 1586
  • [38] Matrix Completion via Non-Convex Relaxation and Adaptive Correlation Learning
    Li, Xuelong
    Zhang, Hongyuan
    Zhang, Rui
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2023, 45 (02) : 1981 - 1991
  • [39] Natasha: Faster Non-Convex Stochastic Optimization via Strongly Non-Convex Parameter
    Allen-Zhu, Zeyuan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [40] Distributed Quantile Regression with Non-Convex Sparse Penalties
    Mirzaeifard, Reza
    Gogineni, Vinay Chakravarthi
    Venkategowda, Naveen K. D.
    Werner, Stefan
    2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 250 - 254