First-Order Methods for Nonconvex Quadratic Minimization

Cited by: 13
Authors
Carmon, Yair [1 ]
Duchi, John C. [1 ,2 ]
Affiliations
[1] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[2] Stanford Univ, Dept Stat, Stanford, CA 94305 USA
Funding
National Science Foundation (USA)
Keywords
gradient descent; Krylov subspace methods; nonconvex quadratics; cubic regularization; trust-region methods; global optimization; Newton's method; nonasymptotic convergence; TRUST-REGION SUBPROBLEM; CUBIC REGULARIZATION; ALGORITHMS;
DOI
10.1137/20M1321759
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems we prove that, under mild assumptions, gradient descent converges to their global solutions and give a nonasymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
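As a rough illustration of the cubic-regularized setting the abstract describes (a minimal sketch, not the authors' code), plain gradient descent can be run on an indefinite quadratic plus a cubic regularizer; the matrix `A`, vector `b`, regularization weight `rho`, step size, and iteration count below are all arbitrary choices for the demo.

```python
import numpy as np

# Cubic-regularized nonconvex quadratic:
#     f(x) = 1/2 x^T A x + b^T x + (rho/3) ||x||^3,
# the problem class whose global convergence under gradient descent
# the paper analyzes. A is indefinite (one negative eigenvalue).
A = np.diag([-1.0, 0.5, 2.0])      # lambda_min(A) = -1
b = np.array([0.3, -0.2, 0.1])     # nonzero component along the negative eigenvector
rho = 1.0

def grad(x):
    # nabla f(x) = A x + b + rho * ||x|| * x
    return A @ x + b + rho * np.linalg.norm(x) * x

x = np.zeros(3)
eta = 0.1                          # fixed step size for the demo
for _ in range(10000):
    x -= eta * grad(x)

# Global optimality for this problem requires a vanishing gradient and
# ||x*|| >= -lambda_min(A) / rho = 1; both can be checked at the iterate.
print("grad norm:", np.linalg.norm(grad(x)))
print("||x||:   ", np.linalg.norm(x))
```

The run converges because `b` has a component along the eigenvector of the negative eigenvalue, so the iterates escape the saddle region at the origin; the paper's analysis covers the general (including hard-case) behavior.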
Pages: 395-436 (42 pages)