Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

Cited by: 0
Authors
Axiotis, Kyriakos [1 ]
Sviridenko, Maxim [2 ]
Affiliations
[1] MIT, Comp Sci & Artificial Intelligence Lab CSAIL, Cambridge, MA 02139 USA
[2] Yahoo Res, 770 Broadway, New York, NY 10003 USA
Keywords
sparse optimization; convex optimization; compressed sensing; iterative hard thresholding; orthogonal matching pursuit; convex regularization; SIGNAL RECOVERY; BOUNDS; SELECTION; PURSUIT
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
The goal of Sparse Convex Optimization is to optimize a convex function $f$ under a sparsity constraint $s \le s^* \gamma$, where $s^*$ is the target number of non-zero entries in a feasible solution (the sparsity) and $\gamma \ge 1$ is an approximation factor. There has been substantial work analyzing the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the Restricted Condition Number $\kappa$. The best known algorithms are guaranteed to find an approximate solution of value $f(x^*) + \epsilon$ with the sparsity bound $\gamma = O\left(\kappa \cdot \min\left\{\log \frac{f(x_0) - f(x^*)}{\epsilon}, \kappa\right\}\right)$, where $x^*$ is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to $\gamma = O(\kappa)$, which has been shown to be tight for a general class of algorithms that includes LASSO, OMP, and IHT. This is achieved without significant sacrifice in runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general $f$, under the condition $s > s^* \kappa^2 / 4$, which yields compressed sensing bounds under the Restricted Isometry Property (RIP). Compared to other compressed sensing approaches, OMPR has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, while working for any general function $f$ that satisfies the RIP condition.
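For illustration only (this is not the paper's ARHT algorithm, which additionally maintains an adaptively tuned $\ell_2$ regularization term): a minimal sketch of the hard thresholding primitive and of plain IHT for the least-squares objective $f(x) = \frac{1}{2}\|Ax - b\|^2$. The function names, step-size rule, and iteration count below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def hard_threshold(x, s):
    """Keep the s largest-magnitude entries of x and zero out the rest."""
    z = np.zeros_like(x)
    if s > 0:
        idx = np.argpartition(np.abs(x), -s)[-s:]
        z[idx] = x[idx]
    return z

def iht_least_squares(A, b, s, n_iter=200):
    """Plain IHT for f(x) = 0.5 * ||Ax - b||^2 at sparsity level s.

    Illustrative sketch only: ARHT would additionally add and adapt an
    l2 regularization term across iterations; that logic is omitted here.
    """
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of grad f.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)                # gradient of the objective
        x = hard_threshold(x - step * grad, s)  # gradient step, then project
    return x
```

Running at a relaxed sparsity level $s = \gamma s^* > s^*$ is exactly the regime the abstract's guarantees describe: for instance, recovering a 5-sparse signal with `iht_least_squares(A, b, s=20)` corresponds to an approximation factor of $\gamma = 4$.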
Pages: 1-47 (47 pages)