Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

Cited by: 0
Authors
Axiotis, Kyriakos [1 ]
Sviridenko, Maxim [2 ]
Affiliations
[1] MIT, Comp Sci & Artificial Intelligence Lab CSAIL, Cambridge, MA 02139 USA
[2] Yahoo Res, 770 Broadway, New York, NY 10003 USA
Keywords
sparse optimization; convex optimization; compressed sensing; iterative hard thresholding; orthogonal matching pursuit; convex regularization; signal recovery; bounds; selection; pursuit
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
The goal of Sparse Convex Optimization is to optimize a convex function f under a sparsity constraint s ≤ γs*, where s* is the target number of non-zero entries in a feasible solution (sparsity) and γ ≥ 1 is an approximation factor. There has been a lot of work to analyze the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the Restricted Condition Number κ. The best known algorithms guarantee to find an approximate solution of value f(x*) + ε with a sparsity bound of γ = O(κ · min{ log((f(x^(0)) − f(x*)) / ε), κ }), where x* is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to γ = O(κ), which has been shown to be tight for a general class of algorithms that includes LASSO, OMP, and IHT. This is achieved without significant sacrifice in runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general f, under the condition s > s* κ² / 4, which yields compressed sensing bounds under the Restricted Isometry Property (RIP). Compared to other compressed sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, while working for any general function f that satisfies the RIP condition.
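The abstract builds on the hard-thresholding primitive that underlies IHT and, with an adaptively chosen regularizer, ARHT. Below is a minimal, self-contained Python sketch of plain IHT for a least-squares objective, meant only to illustrate the update x ← H_s(x − η∇f(x)) that the abstract refers to; it is not the paper's ARHT algorithm, and the step size eta, iteration count, and problem sizes are illustrative assumptions.

```python
# Minimal sketch of plain Iterative Hard Thresholding (IHT) for
#   min f(x)  subject to  ||x||_0 <= s,
# with f(x) = 0.5 * ||Ax - b||^2. This is NOT the paper's ARHT
# algorithm (ARHT additionally inserts an adaptively chosen L2
# regularizer between thresholding steps); it only demonstrates
# the hard-thresholding update the abstract discusses.
import numpy as np

def hard_threshold(x, s):
    """H_s: keep the s largest-magnitude entries of x, zero the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(A, b, s, eta=None, iters=200):
    """Iterate x <- H_s(x - eta * grad f(x)) for the least-squares f."""
    n = A.shape[1]
    if eta is None:
        # 1 / ||A||_2^2 is a standard safe step size for least squares,
        # since the gradient is Lipschitz with constant ||A||_2^2.
        eta = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x = hard_threshold(x - eta * grad, s)
    return x

# Usage: recover a 5-sparse signal from noiseless Gaussian measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400))
x_true = np.zeros(400)
x_true[rng.choice(400, 5, replace=False)] = rng.standard_normal(5)
b = A @ x_true
x_hat = iht(A, b, s=5)
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

In the sketch the sparsity budget s is fixed at the target s*; the abstract's point is that, for guarantees in terms of κ alone, such methods generally need the relaxed budget s ≤ γs* with γ = O(κ).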
Pages: 1-47
Page count: 47