TRUST-REGION NEWTON-CG WITH STRONG SECOND-ORDER COMPLEXITY GUARANTEES FOR NONCONVEX OPTIMIZATION

Cited: 14
Authors
Curtis, Frank E. [1 ]
Robinson, Daniel P. [1 ]
Royer, Clement W. [2 ]
Wright, Stephen J. [3 ]
Affiliations
[1] Lehigh Univ, Dept Ind & Syst Engn, 200 W Packer Ave, Bethlehem, PA 18015 USA
[2] Univ PSL, Univ Paris Dauphine, CNRS, LAMSADE, F-75016 Paris, France
[3] Univ Wisconsin, Dept Comp Sci, 1210 W Dayton St, Madison, WI 53706 USA
Keywords
smooth nonconvex optimization; trust-region methods; Newton's method; conjugate gradient method; Lanczos method; worst-case complexity; negative curvature; cubic regularization; algorithms; bounds; norm
DOI
10.1137/19M130563X
Chinese Library Classification
O29 [Applied Mathematics]
Discipline code
070104
Abstract
Worst-case complexity guarantees for nonconvex optimization algorithms have been a topic of growing interest. Multiple frameworks that achieve the best known complexity bounds among a broad class of first- and second-order strategies have been proposed. These methods have often been designed primarily with complexity guarantees in mind and, as a result, represent a departure from the algorithms that have proved to be the most effective in practice. In this paper, we consider trust-region Newton methods, one of the most popular classes of algorithms for solving nonconvex optimization problems. By introducing slight modifications to the original scheme, we obtain two methods, one based on exact subproblem solves and one exploiting inexact subproblem solves as in the popular "trust-region Newton-conjugate gradient" (trust-region Newton-CG) method, with iteration and operation complexity bounds that match the best known bounds for the aforementioned class of first- and second-order methods. The resulting trust-region Newton-CG method also retains the attractive practical behavior of classical trust-region Newton-CG, which we demonstrate with numerical comparisons on a standard benchmark test set.
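The classical trust-region Newton-CG scheme that the abstract builds on can be illustrated with a minimal sketch. The code below is a textbook Steihaug-Toint truncated-CG subproblem solver inside a standard accept/reject trust-region loop, exercised on the Rosenbrock function; it is not the modified algorithm of the paper (which adds safeguards to obtain the stated complexity guarantees), and all function names and tolerance values here are illustrative choices.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def to_boundary(p, d, delta):
    # Positive root tau of ||p + tau*d|| = delta (a scalar quadratic in tau).
    a, b, c = dot(d, d), 2.0 * dot(p, d), dot(p, p) - delta * delta
    tau = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return [pi + tau * di for pi, di in zip(p, d)]

def steihaug_cg(g, hess_vec, delta, tol=1e-10, max_iter=100):
    """Truncated CG for min_p g'p + 0.5 p'Bp s.t. ||p|| <= delta.
    Terminates on a small residual, at the trust-region boundary,
    or when a direction of negative curvature is detected."""
    p = [0.0] * len(g)
    r = list(g)                       # residual of B p + g at p = 0
    d = [-ri for ri in r]             # initial direction: steepest descent
    if norm(r) < tol:
        return p
    for _ in range(max_iter):
        Bd = hess_vec(d)
        dBd = dot(d, Bd)
        if dBd <= 0.0:                # negative curvature: follow d to the boundary
            return to_boundary(p, d, delta)
        rr = dot(r, r)
        alpha = rr / dBd
        p_new = [pi + alpha * di for pi, di in zip(p, d)]
        if norm(p_new) >= delta:      # step leaves the region: stop on the boundary
            return to_boundary(p, d, delta)
        r = [ri + alpha * bdi for ri, bdi in zip(r, Bd)]
        if norm(r) < tol:
            return p_new
        beta = dot(r, r) / rr
        d = [-ri + beta * di for ri, di in zip(r, d)]
        p = p_new
    return p

def trust_region_newton_cg(f, grad, hess_vec, x0, delta=1.0,
                           max_iter=500, gtol=1e-8):
    """Classical trust-region loop: CG step, ratio test, radius update."""
    x = list(x0)
    for _ in range(max_iter):
        g = grad(x)
        if norm(g) < gtol:
            break
        hv = lambda v: hess_vec(x, v)
        p = steihaug_cg(g, hv, delta)
        predicted = -(dot(g, p) + 0.5 * dot(p, hv(p)))   # model decrease
        x_trial = [xi + pi for xi, pi in zip(x, p)]
        rho = (f(x) - f(x_trial)) / max(predicted, 1e-300)
        if rho < 0.25:
            delta *= 0.25             # poor model fit: shrink the region
        elif rho > 0.75 and norm(p) >= 0.99 * delta:
            delta *= 2.0              # good fit and a boundary step: expand
        if rho > 1e-4:
            x = x_trial               # accept the trial point
    return x

# Nonconvex test problem: Rosenbrock, with minimizer at (1, 1).
def rosenbrock(x):
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    return [-2.0 * (1.0 - x[0]) - 400.0 * x[0] * (x[1] - x[0] ** 2),
            200.0 * (x[1] - x[0] ** 2)]

def rosenbrock_hess_vec(x, v):
    # Exact Hessian-vector product; only products, never the full matrix.
    h11 = 2.0 - 400.0 * (x[1] - x[0] ** 2) + 800.0 * x[0] ** 2
    h12 = -400.0 * x[0]
    return [h11 * v[0] + h12 * v[1], h12 * v[0] + 200.0 * v[1]]

x_star = trust_region_newton_cg(rosenbrock, rosenbrock_grad,
                                rosenbrock_hess_vec, [-1.2, 1.0])
```

Note that the loop only ever touches the Hessian through matrix-vector products, which is what makes the operation-complexity analysis of Newton-CG methods meaningful for large-scale problems.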
Pages: 518-544
Page count: 27