Efficient Sparse Recovery via Adaptive Non-Convex Regularizers with Oracle Property

Cited by: 0
Authors
Lin, Ming [1 ]
Jin, Rong [2 ]
Zhang, Changshui [1 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Michigan State Univ, Comp Sci & Engn, E Lansing, MI 48823 USA
Keywords
HIGH-DIMENSIONAL REGRESSION; VARIABLE SELECTION; CONVERGENCE; SHRINKAGE; ALGORITHM; LASSO
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
The main shortcoming of sparse recovery with a convex regularizer is that it yields a biased estimator and therefore performs suboptimally in many cases. Recent studies have shown, both theoretically and empirically, that non-convex regularizers can overcome this bias. Although multiple algorithms have been developed for sparse recovery with non-convex regularization, they are either computationally demanding or lack the desired properties (i.e., optimal recovery error, selection consistency, and the oracle property). In this work, we develop an efficient sparse-recovery algorithm based on proximal gradient descent. Its key feature is the introduction of adaptive non-convex regularizers whose shrinking thresholds vary over the iterations. The algorithm is compatible with most popular non-convex regularizers, achieves a geometric convergence rate for the recovery error, is selection consistent, and, most importantly, has the oracle property. Within the proposed framework, we suggest using a so-called ACCQ regularizer, which is equivalent to adaptive hard-thresholding with zero proximal projection gap. Experiments on both synthetic data sets and real images verify the efficiency and effectiveness of the proposed method compared with state-of-the-art methods for sparse recovery.
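To make the iteration scheme described in the abstract concrete, the sketch below runs proximal gradient descent on a least-squares loss with a hard-thresholding step whose level decays over the iterations. This is a minimal illustration, not the paper's exact ACCQ construction: the geometric schedule `tau0 * decay**t`, the floor `tau_min`, and all function names are assumptions made for exposition.

```python
import numpy as np

def adaptive_hard_threshold(z, tau):
    """Hard-thresholding proximal step: zero out entries with |z_i| <= tau."""
    out = z.copy()
    out[np.abs(out) <= tau] = 0.0
    return out

def sparse_recovery_pgd(A, b, tau0=0.5, decay=0.9, tau_min=0.05, n_iter=300):
    """Proximal gradient descent with an iteration-varying threshold.

    A minimal sketch, assuming a least-squares loss 0.5*||Ax - b||^2,
    step size 1/L with L the largest eigenvalue of A^T A, and an
    illustrative threshold schedule tau_t = max(tau0 * decay**t, tau_min).
    """
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for t in range(n_iter):
        grad = A.T @ (A @ x - b)           # gradient of the least-squares loss
        tau = max(tau0 * decay ** t, tau_min)
        x = adaptive_hard_threshold(x - grad / L, tau)
    return x

# Toy usage: recover a 5-sparse signal from noisy Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400)) / 10.0   # columns have roughly unit norm
x_true = np.zeros(400)
x_true[rng.choice(400, 5, replace=False)] = 3.0 * rng.standard_normal(5)
b = A @ x_true + 0.01 * rng.standard_normal(100)
x_hat = sparse_recovery_pgd(A, b)
print("recovered support:", np.nonzero(x_hat)[0])
```

The decaying threshold is what makes the scheme "adaptive" in the sense of the abstract: early iterations threshold aggressively to lock in large coefficients, while later iterations admit smaller ones. The geometric decay used here is only a plausible stand-in for the paper's schedule.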
Pages: 505-514
Page count: 10