A double parameter self-scaling memoryless BFGS method for unconstrained optimization

Author
Neculai Andrei
Affiliation
[1] Academy of Romanian Scientists, Center for Advanced Modeling and Optimization
Keywords
Unconstrained optimization; Self-scaling memoryless BFGS method; Global convergence; Numerical comparisons; 49M7; 49M10; 65K05; 90C30
DOI: not available
Abstract
A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with one positive parameter, while the third term is scaled with another positive parameter. The first parameter, scaling the first two terms, is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter, scaling the third term, is computed as a preconditioner to the Hessian of the minimizing function, combined with the minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. The global convergence of this method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that this algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. With respect to the CPU time metric, CG-DESCENT is the top performer. Comparisons with L-BFGS show that our algorithm is more efficient.
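The abstract does not give the closed-form expressions for the two scaling parameters, but the shape of the update it describes can be sketched. Below is a minimal NumPy illustration of the search direction d = -H g for a double parameter self-scaling memoryless BFGS matrix of the standard three-term form, with the first two terms scaled by delta and the third by gamma; the specific choices of delta and gamma in the comments (Oren-Spedicato scaling, gamma = 1) are placeholder assumptions, not the formulas of the paper.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, delta, gamma):
    """Search direction d = -H g for the double parameter self-scaling
    memoryless BFGS matrix

        H = delta * ( I - (s y^T + y s^T) / (y^T s) )
            + gamma * (1 + y^T y / (y^T s)) * (s s^T) / (y^T s),

    where g is the current gradient, s = x_{k+1} - x_k and
    y = g_{k+1} - g_k. Computed with inner products only, so no
    n x n matrix is ever formed (memoryless update)."""
    ys = y @ s  # curvature y^T s, positive under the Wolfe line search
    yg = y @ g
    sg = s @ g
    return (-delta * g
            + delta * (yg * s + sg * y) / ys
            - gamma * (1.0 + (y @ y) / ys) * (sg / ys) * s)

# Illustrative (hypothetical) parameter choices, NOT the paper's formulas:
# delta = (y @ s) / (y @ y)   # Oren-Spedicato scaling
# gamma = 1.0                 # plain memoryless BFGS third term
```

A quick sanity check: with delta = gamma = 1 the matrix H reduces to the memoryless BFGS update, which satisfies the secant condition H y = s, so calling the function with g = y returns -s.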