A globally and quadratically convergent algorithm with efficient implementation for unconstrained optimization

Cited by: 2
Authors
Yang, Yaguang [1 ]
Affiliation
[1] US NRC, Res Off, Rockville, MD 20850 USA
Source
COMPUTATIONAL & APPLIED MATHEMATICS | 2015, Vol. 34, No. 3
Keywords
Global convergence; Quadratic convergence; Non-convex unconstrained optimization;
DOI
10.1007/s40314-014-0172-5
CLC Number
O29 [Applied Mathematics];
Discipline Code
070104;
Abstract
In this paper, an efficient modified Newton-type algorithm is proposed for nonlinear unconstrained optimization problems. The modified Hessian is a convex combination of the identity matrix (as in the steepest descent algorithm) and the Hessian matrix (as in the Newton algorithm). The coefficients of the convex combination are chosen dynamically at every iteration. The algorithm is proved to be globally and quadratically convergent for (convex and non-convex) nonlinear functions. An efficient implementation is described. Numerical tests on the widely used CUTEr test problems are conducted for the new algorithm. The test results are compared with those obtained by the MATLAB Optimization Toolbox function fminunc, and with those obtained by several established and state-of-the-art algorithms: a limited-memory BFGS method, a descent conjugate gradient algorithm, and a limited-memory descent conjugate gradient algorithm. The comparisons show that the new algorithm is promising.
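The idea described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's method: the blending coefficient here is chosen from the smallest Hessian eigenvalue so that the convex combination is positive definite, whereas the paper's actual coefficient-selection rule (and its efficient implementation) differ; the Rosenbrock test function, the Armijo constants, and the threshold `delta` are all assumptions made for the sketch.

```python
import numpy as np

def modified_newton(f, grad, hess, x0, tol=1e-8, max_iter=200, delta=1e-4):
    """Newton-type method whose search direction solves B d = -g with
    B = (1 - beta) * H + beta * I, a convex combination of the Hessian H
    and the identity I. The eigenvalue-based choice of beta below is an
    illustrative rule, not the paper's coefficient-selection scheme."""
    x = np.asarray(x0, dtype=float)
    n = x.size
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        H = hess(x)
        lam_min = np.linalg.eigvalsh(H).min()
        if lam_min >= delta:
            beta = 0.0  # Hessian safely positive definite: pure Newton step
        else:
            # smallest beta making min-eig((1-beta)H + beta*I) equal delta
            # (valid since lam_min < delta < 1 implies 1 - lam_min > 0)
            beta = (delta - lam_min) / (1.0 - lam_min)
        B = (1.0 - beta) * H + beta * np.eye(n)  # convex combination of H and I
        d = np.linalg.solve(B, -g)               # modified Newton direction
        # Armijo backtracking line search; terminates because B is positive
        # definite, so d is a strict descent direction
        t, fx, slope = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
    return x

# Demo on the (non-convex) Rosenbrock function with analytic derivatives
rosen = lambda x: (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2
rosen_g = lambda x: np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                              200 * (x[1] - x[0]**2)])
rosen_h = lambda x: np.array([[2 - 400 * (x[1] - 3 * x[0]**2), -400 * x[0]],
                              [-400 * x[0], 200.0]])

x_star = modified_newton(rosen, rosen_g, rosen_h, [-1.2, 1.0])
```

Far from the minimizer the eigenvalue test pushes weight toward the identity (a damped, gradient-like step, giving global convergence via the line search); near the minimizer the Hessian dominates and the iteration reduces to Newton's method, which is where the quadratic rate comes from.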
Pages: 1219-1236
Page count: 18