On Asymptotic Linear Convergence of Projected Gradient Descent for Constrained Least Squares

Cited by: 8
Authors
Vu, Trung [1 ]
Raich, Raviv [1 ]
Affiliation
[1] Oregon State Univ, Sch Elect Engn & Comp Sci, Corvallis, OR 97331 USA
Keywords
Convergence; Optimization; Manifolds; Signal processing algorithms; Machine learning; Symmetric matrices; Sparse matrices; Asymptotic convergence rate; constrained least squares; local linear convergence; projected gradient descent; MATRIX; ALGORITHM; RESTORATION; MACHINE; MODEL;
DOI
10.1109/TSP.2022.3192142
CLC Classification
TM [Electrical Engineering]; TN [Electronic and Communication Technology];
Subject Classification
0808 ; 0809 ;
Abstract
Many recent problems in signal processing and machine learning such as compressed sensing, image restoration, matrix/tensor recovery, and non-negative matrix factorization can be cast as constrained optimization. Projected gradient descent is a simple yet efficient method for solving such constrained optimization problems. Local convergence analysis furthers our understanding of its asymptotic behavior near the solution, offering sharper bounds on the convergence rate compared to global convergence analysis. However, local guarantees often appear scattered in problem-specific areas of machine learning and signal processing. This manuscript presents a unified framework for the local convergence analysis of projected gradient descent in the context of constrained least squares. The proposed analysis offers insights into pivotal local convergence properties such as the conditions for linear convergence, the region of convergence, the exact asymptotic rate of convergence, and the bound on the number of iterations needed to reach a certain level of accuracy. To demonstrate the applicability of the proposed approach, we present a recipe for the convergence analysis of projected gradient descent and illustrate it via a beginning-to-end application on four fundamental problems, namely, linear equality-constrained least squares, sparse recovery, least squares with the unit norm constraint, and matrix completion.
Pages: 4061-4076
Page count: 16
Related Papers
50 records total
  • [1] On Local Linear Convergence of Projected Gradient Descent for Unit-Modulus Least Squares
    Vu, Trung
    Raich, Raviv
    Fu, Xiao
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 3883 - 3897
  • [2] A unifying analysis of projected gradient descent for lp-constrained least squares
    Bahmani, S.
    Raj, B.
    [J]. APPLIED AND COMPUTATIONAL HARMONIC ANALYSIS, 2013, 34 (03) : 366 - 378
  • [3] Quantum gradient descent for linear systems and least squares
    Kerenidis, Iordanis
    Prakash, Anupam
    [J]. PHYSICAL REVIEW A, 2020, 101 (02)
  • [4] On Projected Stochastic Gradient Descent Algorithm with Weighted Averaging for Least Squares Regression
    Cohen, Kobi
    Nedic, Angelia
    Srikant, R.
    [J]. IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2017, 62 (11) : 5974 - 5981
  • [5] On Projected Stochastic Gradient Descent Algorithm with Weighted Averaging for Least Squares Regression
    Cohen, Kobi
    Nedic, Angelia
    Srikant, R.
    [J]. 2016 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING PROCEEDINGS, 2016, : 2314 - 2318
  • [6] Constrained Stochastic Gradient Descent for Large-scale Least Squares Problem
    Mu, Yang
    Ding, Wei
    Zhou, Tianyi
    Tao, Dacheng
    [J]. 19TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY AND DATA MINING (KDD'13), 2013, : 883 - 891
  • [7] The turbo decoder as a least squares cost gradient descent
    Walsh, JM
    Johnson, CR
    Regalia, PA
    [J]. 2005 IEEE 6TH WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS, 2005, : 675 - 679
  • [8] Asymptotic behaviour in linear least squares problems
    Osborne, M. R.
    [J]. IMA JOURNAL OF NUMERICAL ANALYSIS, 2010, 30 (01) : 241 - 247
  • [9] On the Convergence of the Generalized Linear Least Squares Algorithm
    Negoita, C.
    Renaut, R. A.
    [J]. BIT NUMERICAL MATHEMATICS, 2005, 45 (01) : 137 - 158