Sparse Convex Optimization via Adaptively Regularized Hard Thresholding

Times Cited: 0
Authors
Axiotis, Kyriakos [1 ]
Sviridenko, Maxim [2 ]
Affiliations
[1] MIT, Computer Science & Artificial Intelligence Laboratory (CSAIL), Cambridge, MA 02139, USA
[2] Yahoo Research, 770 Broadway, New York, NY 10003, USA
Keywords
sparse optimization; convex optimization; compressed sensing; iterative hard thresholding; orthogonal matching pursuit; convex regularization; SIGNAL RECOVERY; BOUNDS; SELECTION; PURSUIT
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The goal of sparse convex optimization is to optimize a convex function f under a sparsity constraint s ≤ s*·γ, where s* is the target number of non-zero entries in a feasible solution (sparsity) and γ ≥ 1 is an approximation factor. There has been a lot of work analyzing the sparsity guarantees of various algorithms (LASSO, Orthogonal Matching Pursuit (OMP), Iterative Hard Thresholding (IHT)) in terms of the restricted condition number κ. The best known algorithms are guaranteed to find an approximate solution of value f(x*) + ε with a sparsity bound of γ = O(κ · min{log((f(x_0) − f(x*))/ε), κ}), where x_0 is the initial solution and x* is the target solution. We present a new Adaptively Regularized Hard Thresholding (ARHT) algorithm that makes significant progress on this problem by bringing the bound down to γ = O(κ), which has been shown to be tight for a general class of algorithms including LASSO, OMP, and IHT. This is achieved without significant sacrifice in runtime efficiency compared to the fastest known algorithms. We also provide a new analysis of OMP with Replacement (OMPR) for general f, under the condition s > s*·κ²/4, which yields compressed sensing bounds under the Restricted Isometry Property (RIP). Compared to other compressed sensing approaches, it has the advantage of providing a strong tradeoff between the RIP condition and the solution sparsity, while working for any general function f that satisfies the RIP condition.
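For context on the baselines named in the abstract (IHT in particular), below is a minimal sketch of vanilla Iterative Hard Thresholding for the least-squares objective. This is not the paper's ARHT algorithm; the function names, step-size choice, and iteration count are assumptions made for illustration.

    import numpy as np

    def hard_threshold(x, s):
        # Keep the s largest-magnitude entries of x; zero out the rest.
        out = x.copy()
        out[np.argsort(np.abs(x))[:-s]] = 0.0
        return out

    def iht_least_squares(A, b, s, iters=300):
        # Vanilla IHT for: minimize (1/2)||Ax - b||^2 subject to ||x||_0 <= s.
        # Step size 1/||A||_2^2 (the inverse of the gradient's Lipschitz
        # constant for this quadratic) is a conservative, standard choice.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            grad = A.T @ (A @ x - b)  # gradient of (1/2)||Ax - b||^2
            x = hard_threshold(x - step * grad, s)
        return x

    # Usage: attempt to recover a 3-sparse vector from 80 Gaussian measurements.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((80, 200)) / np.sqrt(80)
    x_true = np.zeros(200)
    x_true[[3, 50, 120]] = [1.0, -2.0, 1.5]
    x_hat = iht_least_squares(A, A @ x_true, s=3)

Per its name, ARHT augments a hard-thresholding loop of this kind with adaptive regularization; the abstract does not specify the mechanism, so see the paper for details.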
Pages: 1 - 47
Page count: 47
Related Papers
50 items in total
  • [21] Block-Sparse Recovery via Convex Optimization
    Elhamifar, Ehsan
    Vidal, Rene
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2012, 60 (08) : 4094 - 4107
  • [22] Accelerate Randomized Coordinate Descent Iterative Hard Thresholding Methods for l0 Regularized Convex Problems
    Ding, Liyong
    Song, Enbin
    Zhu, Yunmin
    PROCEEDINGS OF THE 35TH CHINESE CONTROL CONFERENCE 2016, 2016, : 2816 - 2819
  • [23] Randomized Iterative Hard Thresholding for Sparse Approximations
    Crandall, Robert
    Dong, Bin
    Bilgin, Ali
    2014 DATA COMPRESSION CONFERENCE (DCC 2014), 2014, : 403 - 403
  • [24] Accelerated Hard Thresholding Algorithms for Sparse Recovery
    Zhao, Xueci
    Du, Peibing
    Sun, Tao
    Cheng, Lizhi
    PROCEEDINGS OF THE 2017 5TH INTERNATIONAL CONFERENCE ON FRONTIERS OF MANUFACTURING SCIENCE AND MEASURING TECHNOLOGY (FMSMT 2017), 2017, 130 : 1322 - 1328
  • [25] Sparse Adaptive Filtering by Iterative Hard Thresholding
    Das, Rajib Lochan
    Chakraborty, Mrityunjoy
    2013 ASIA-PACIFIC SIGNAL AND INFORMATION PROCESSING ASSOCIATION ANNUAL SUMMIT AND CONFERENCE (APSIPA), 2013,
  • [26] On Accelerated Hard Thresholding Methods for Sparse Approximation
    Cevher, Volkan
    WAVELETS AND SPARSITY XIV, 2011, 8138
  • [27] Optimization of Sparse Cross Array Synthesis via Perturbed Convex Optimization
    Gu, Boxuan
    Chen, Yaowu
    Jiang, Rongxin
    Liu, Xuesong
    SENSORS, 2020, 20 (17) : 1 - 17
  • [28] Sparse Signal Reconstruction from Quantized Noisy Measurements via GEM Hard Thresholding
    Qiu, Kun
    Dogandzic, Aleksandar
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2012, 60 (05) : 2628 - 2634
  • [29] LARGE-SCALE REGULARIZED PORTFOLIO SELECTION VIA CONVEX OPTIMIZATION
    Zhao, Ziping
    Palomar, Daniel P.
    2019 7TH IEEE GLOBAL CONFERENCE ON SIGNAL AND INFORMATION PROCESSING (IEEE GLOBALSIP), 2019,
  • [30] Hard thresholding pursuit with continuation for ℓ0-regularized minimizations
    Sun, Tao
    Jiang, Hao
    Cheng, Lizhi
    MATHEMATICAL METHODS IN THE APPLIED SCIENCES, 2018, 41 (16) : 6195 - 6209