On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares

Citations: 8
Authors
Vu, Trung [1 ]
Raich, Raviv [1 ]
Affiliations
[1] Oregon State Univ, Sch Elect Engn & Comp Sci, Corvallis, OR 97331 USA
Keywords
Convergence; Optimization; Manifolds; Signal processing algorithms; Machine learning; Symmetric matrices; Sparse matrices; Asymptotic convergence rate; constrained least squares; local linear convergence; projected gradient descent; MATRIX; ALGORITHM; RESTORATION; MACHINE; MODEL;
DOI
10.1109/TSP.2022.3192142
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronics and communication technology];
Discipline codes
0808; 0809
Abstract
Many recent problems in signal processing and machine learning such as compressed sensing, image restoration, matrix/tensor recovery, and non-negative matrix factorization can be cast as constrained optimization. Projected gradient descent is a simple yet efficient method for solving such constrained optimization problems. Local convergence analysis furthers our understanding of its asymptotic behavior near the solution, offering sharper bounds on the convergence rate compared to global convergence analysis. However, local guarantees often appear scattered in problem-specific areas of machine learning and signal processing. This manuscript presents a unified framework for the local convergence analysis of projected gradient descent in the context of constrained least squares. The proposed analysis offers insights into pivotal local convergence properties such as the conditions for linear convergence, the region of convergence, the exact asymptotic rate of convergence, and the bound on the number of iterations needed to reach a certain level of accuracy. To demonstrate the applicability of the proposed approach, we present a recipe for the convergence analysis of projected gradient descent and demonstrate it via a beginning-to-end application of the recipe on four fundamental problems, namely, linear equality-constrained least squares, sparse recovery, least squares with the unit norm constraint, and matrix completion.
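The projected gradient descent iteration discussed in the abstract alternates a gradient step on the least-squares objective with a projection onto the constraint set. The sketch below is an illustrative implementation, not the authors' code: the names `projected_gradient_descent` and `project_unit_sphere` are assumptions, and the unit-norm constraint (the paper's third example problem) is used with a synthetic noiseless system. The initial point is taken near the solution to mirror the paper's *local* convergence setting.

```python
import numpy as np

def projected_gradient_descent(A, b, project, x0, step=None, n_iter=2000):
    """PGD for min 0.5*||Ax - b||^2 over the constraint set defined by `project`."""
    if step is None:
        # Classic step size 1/L, where L = ||A||_2^2 is the Lipschitz
        # constant of the gradient of 0.5*||Ax - b||^2.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = project(x0)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)        # gradient of 0.5*||Ax - b||^2
        x = project(x - step * grad)    # gradient step, then projection
    return x

# Unit-norm constraint: projection onto the unit sphere.
def project_unit_sphere(v):
    return v / np.linalg.norm(v)

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = project_unit_sphere(rng.standard_normal(5))
b = A @ x_true                          # noiseless measurements

# Initialize near the solution to illustrate local (linear) convergence.
x0 = project_unit_sphere(x_true + 0.1 * rng.standard_normal(5))
x_hat = projected_gradient_descent(A, b, project_unit_sphere, x0)
```

With a well-conditioned Gaussian `A` and noiseless data, the iterates stay feasible (unit norm) and the residual `||A x_hat - b||` shrinks linearly toward zero, which is the behavior the paper's asymptotic rate characterizes exactly.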
Pages: 4061-4076 (16 pages)