First-Order Methods for Nonconvex Quadratic Minimization

Cited by: 13
Authors:
Carmon, Yair [1]
Duchi, John C. [1,2]
Affiliations:
[1] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
Funding:
National Science Foundation (USA)
Keywords:
gradient descent; Krylov subspace methods; nonconvex quadratics; cubic regularization; trust-region methods; global optimization; Newton's method; nonasymptotic convergence; TRUST-REGION SUBPROBLEM; CUBIC REGULARIZATION; ALGORITHMS
DOI:
10.1137/20M1321759
Chinese Library Classification:
O29 (Applied Mathematics)
Discipline Code:
070104
Abstract
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems we prove that, under mild assumptions, gradient descent converges to their global solutions and give a nonasymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
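To make the abstract's central claim concrete, here is a minimal NumPy sketch (on entirely synthetic data, not an instance from the paper) of plain gradient descent applied to a cubic-regularized indefinite quadratic f(x) = ½xᵀAx + bᵀx + (ρ/3)‖x‖³. The spectrum, regularization weight ρ, step size, and iteration count below are illustrative assumptions; the check at the end uses the standard global-optimality condition for this problem, (A + λI)x = −b with λ = ρ‖x‖ and A + λI positive semidefinite.

```python
import numpy as np

# Synthetic indefinite instance (illustrative, not from the paper):
# f(x) = 1/2 x'Ax + b'x + (rho/3)||x||^3 with A symmetric indefinite.
rng = np.random.default_rng(0)
n = 50
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
eigs = np.linspace(-1.0, 2.0, n)            # negative eigenvalues: f is nonconvex
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(n)
rho = 1.0

def grad(x):
    # gradient of the cubic-regularized quadratic
    return A @ x + b + rho * np.linalg.norm(x) * x

# Initialize with a tiny step along -b, then run plain gradient descent with a
# fixed (assumed small enough) step size; the paper proves such iterates reach
# the global minimizer despite nonconvexity, under mild assumptions.
x = -1e-3 * b / np.linalg.norm(b)
eta = 0.05
for _ in range(20000):
    x = x - eta * grad(x)

# Global optimality check: (A + lam*I) x = -b with lam = rho*||x||,
# and A + lam*I positive semidefinite (lam must exceed -min eigenvalue of A).
lam = rho * np.linalg.norm(x)
residual = np.linalg.norm((A + lam * np.eye(n)) @ x + b)
print(f"stationarity residual: {residual:.2e}, lam + min eig: {lam + eigs.min():.3f}")
```

On generic instances like this one the stationarity residual shrinks to numerical precision and λ + λ_min(A) is strictly positive, certifying a global (not merely local) solution; the Krylov subspace methods the abstract mentions reach comparable accuracy in far fewer matrix-vector products.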
Pages: 395-436 (42 pages)