An adaptive conjugate gradient algorithm for large-scale unconstrained optimization

Cited by: 36
Author
Andrei, Neculai [1]
Affiliation
[1] Center for Advanced Modeling & Optimization, Research Institute for Informatics, Bucharest 1, Romania
Keywords
Unconstrained optimization; Condition number of a matrix; Adaptive conjugate gradient method; Numerical comparisons; GUARANTEED DESCENT; CONVERGENCE
DOI
10.1016/j.cam.2015.07.003
CLC number
O29 [Applied Mathematics]
Subject classification
070104
Abstract
An adaptive conjugate gradient algorithm is presented. The search direction is computed as the sum of the negative gradient and a vector determined by minimizing the quadratic approximation of the objective function at the current point. Using a special approximation of the inverse Hessian of the objective function, which depends on a positive parameter, we obtain a search direction that satisfies both the sufficient descent condition and the Dai-Liao conjugacy condition. The parameter in the search direction is determined adaptively by minimizing the largest eigenvalue of the matrix defining it, in order to cluster all the eigenvalues. Global convergence of the algorithm is proved for uniformly convex functions. Using a set of 800 unconstrained optimization test problems, we show that our algorithm is significantly more efficient and more robust than the CG-DESCENT algorithm. By solving five applications from the MINPACK-2 test problem collection, each with 10^6 variables, we show that the suggested adaptive conjugate gradient algorithm is a top performer relative to CG-DESCENT. (C) 2015 Elsevier B.V. All rights reserved.
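The abstract outlines the construction but gives no formulas. As a rough illustration only, the Python sketch below shows a generic Dai-Liao-type conjugate gradient loop using the classical coefficient beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k) with a fixed parameter t; the function name dai_liao_cg, the Armijo backtracking search, and the tolerances are assumptions of this sketch, and the paper's actual contribution (choosing the parameter adaptively by minimizing the largest eigenvalue of the matrix defining the direction) is not reproduced here.

    import numpy as np

    def dai_liao_cg(f, grad, x0, t=1.0, tol=1e-6, max_iter=500):
        # Illustrative Dai-Liao-type CG loop; `t` is fixed here, whereas the
        # paper selects it adaptively to cluster the eigenvalues of the
        # matrix defining the search direction.
        x = np.asarray(x0, dtype=float)
        g = grad(x)
        d = -g                                 # first step: steepest descent
        for _ in range(max_iter):
            if np.linalg.norm(g, np.inf) <= tol:
                break
            if g @ d >= 0:                     # safeguard: restart if d is not a descent direction
                d = -g
            # Armijo backtracking line search (a stand-in for the Wolfe
            # search usually paired with CG methods)
            alpha, fx, gTd = 1.0, f(x), g @ d
            while f(x + alpha * d) > fx + 1e-4 * alpha * gTd:
                alpha *= 0.5
            x_new = x + alpha * d
            g_new = grad(x_new)
            s, y = x_new - x, g_new - g
            dy = d @ y
            # classical Dai-Liao coefficient; restart when the denominator is tiny
            beta = (g_new @ (y - t * s)) / dy if abs(dy) > 1e-12 else 0.0
            d = -g_new + beta * d
            x, g = x_new, g_new
        return x

    # Usage on a small ill-conditioned quadratic:
    A = np.diag([1.0, 10.0, 100.0])
    xmin = dai_liao_cg(lambda x: 0.5 * x @ A @ x, lambda x: A @ x, np.ones(3))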
Pages: 83-91
Page count: 9