FAST GLOBAL CONVERGENCE OF GRADIENT METHODS FOR SOLVING REGULARIZED M-ESTIMATION

Cited by: 0
Authors
Agarwal, Alekh [1 ]
Negahban, Sahand [2 ]
Wainwright, Martin J. [3 ]
Affiliations
[1] Univ Calif Berkeley, Dept EECS, Berkeley, CA 94720 USA
[2] MIT, Cambridge, MA 02139 USA
[3] Univ Calif Berkeley, Dept EECS & Stat, Berkeley, CA 94720 USA
Source
2012 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP (SSP) | 2012
Keywords
High-dimensional statistics; convex optimization; gradient methods; SELECTION;
DOI
Not available
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology]
Discipline Classification Code
0808; 0809
Abstract
We analyze the convergence rates of composite gradient methods for solving problems based on regularized M-estimators, working within a high-dimensional framework that allows the data dimension d to grow with (and possibly exceed) the sample size n. This high-dimensional structure precludes the usual global assumptions, namely the strong convexity and smoothness conditions that underlie much of classical optimization analysis. We define appropriately restricted versions of these conditions and show that they are satisfied with high probability for various statistical models. Under these conditions, our theory guarantees that composite gradient descent has a globally geometric rate of convergence up to the statistical precision of the model, meaning the typical distance between the true unknown parameter θ* and an optimal solution θ̂. This result is substantially sharper than previous results, which yielded sublinear convergence or linear convergence only up to the noise level, and it builds on our earlier work for constrained estimation problems. Our analysis applies to a wide range of M-estimators and statistical models, including sparse linear regression using the Lasso (ℓ1-regularized regression); group Lasso for block sparsity; log-linear models with regularization; low-rank matrix recovery using nuclear-norm regularization; and matrix decomposition. Overall, our analysis reveals interesting connections between statistical precision and computational efficiency in high-dimensional estimation.
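The following is a minimal sketch of one instance of the setting described in the abstract: composite gradient descent (ISTA-style proximal gradient steps) applied to the Lasso, i.e., ℓ1-regularized least squares in a regime where d exceeds n. It is an illustrative assumption-laden example, not the authors' implementation; the problem setup, step-size choice, and regularization scaling are standard textbook choices rather than details taken from the paper.

```python
# Sketch: composite gradient descent (ISTA) for the Lasso,
#   minimize_beta (1/2n) * ||y - X beta||_2^2 + lam * ||beta||_1.
# All names and parameter choices below are illustrative assumptions.
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of tau * ||.||_1 (elementwise soft thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def composite_gradient_lasso(X, y, lam, n_iters=500):
    n, d = X.shape
    beta = np.zeros(d)
    # Step size 1/L, where L bounds the smoothness constant of the
    # quadratic loss: L = lambda_max(X^T X) / n (spectral norm squared / n).
    L = np.linalg.norm(X, 2) ** 2 / n
    step = 1.0 / L
    for _ in range(n_iters):
        grad = X.T @ (X @ beta - y) / n                         # gradient of smooth loss
        beta = soft_threshold(beta - step * grad, step * lam)   # proximal (composite) step
    return beta

# Toy usage: sparse regression with d > n (high-dimensional regime).
rng = np.random.default_rng(0)
n, d, s = 100, 200, 5
X = rng.standard_normal((n, d))
beta_star = np.zeros(d)
beta_star[:s] = 1.0
y = X @ beta_star + 0.1 * rng.standard_normal(n)
lam = 0.1 * np.sqrt(np.log(d) / n)   # typical theoretical scaling for the Lasso
beta_hat = composite_gradient_lasso(X, y, lam)
```

Under restricted strong convexity and smoothness of the kind the paper establishes for such designs, iterates of this type contract geometrically toward θ̂ up to the statistical precision ||θ̂ - θ*||.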
Pages: 409-412
Number of pages: 4
Related Papers
50 records in total
  • [1] Complexity penalized M-estimation: Fast computation
    Friedrich, F.
    Kempe, A.
    Liebscher, V.
    Winkler, G.
    JOURNAL OF COMPUTATIONAL AND GRAPHICAL STATISTICS, 2008, 17 (01) : 201 - 224
  • [2] Weak Convergence of the Regularization Path in Penalized M-Estimation
    Germain, Jean-Francois
    Roueff, Francois
    SCANDINAVIAN JOURNAL OF STATISTICS, 2010, 37 (03) : 477 - 495
  • [3] Fast Global Convergence of Natural Policy Gradient Methods with Entropy Regularization
    Cen, Shicong
    Cheng, Chen
    Chen, Yuxin
    Wei, Yuting
    Chi, Yuejie
    OPERATIONS RESEARCH, 2021, 70 (04) : 2563 - 2578
  • [4] FAST GLOBAL CONVERGENCE OF GRADIENT METHODS FOR HIGH-DIMENSIONAL STATISTICAL RECOVERY
    Agarwal, Alekh
    Negahban, Sahand
    Wainwright, Martin J.
    ANNALS OF STATISTICS, 2012, 40 (05): : 2452 - 2482
  • [5] Fast gradient methods based on global motion estimation for video compression
    Keller, Y
    Averbuch, A
    IEEE TRANSACTIONS ON CIRCUITS AND SYSTEMS FOR VIDEO TECHNOLOGY, 2003, 13 (04) : 300 - 309
  • [6] Fast gradient methods based global motion estimation for video compression
    Keller, Y
    Averbuch, A
    Miller, O
    2002 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOL I, PROCEEDINGS, 2002, : 665 - 668
  • [7] Path following algorithms for l2-regularized M-estimation with approximation guarantee
    Zhu, Yunzhang
    Liu, Renxiong
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [8] OPTIMAL SURVEY SCHEMES FOR STOCHASTIC GRADIENT DESCENT WITH APPLICATIONS TO M-ESTIMATION
    Clemencon, Stephan
    Bertail, Patrice
    Chautru, Emilie
    Papa, Guillaume
    ESAIM-PROBABILITY AND STATISTICS, 2019, 23 : 310 - 337
  • [9] Kalman Filter Outlier Detection Methods Based on M-estimation
    Liu Zhe
    Wang Junfeng
    Wu Yu
    Xiong Lijun
    Qian Kechang
    2013 32ND CHINESE CONTROL CONFERENCE (CCC), 2013, : 4652 - 4655
  • [10] CONVERGENCE RATE ESTIMATES FOR REGULARIZED METHODS OF SOLVING VARIATIONAL PROBLEMS
    KAPLAN, AA
    SOVIET JOURNAL OF NUMERICAL ANALYSIS AND MATHEMATICAL MODELLING, 1989, 4 (02): : 99 - 109