Acceleration of conjugate gradient algorithms for unconstrained optimization

Cited by: 44
Author
Andrei, Neculai [1 ]
Affiliation
[1] Ctr Adv Modeling & Optimizat, Res Inst Informat, Bucharest 1, Romania
Keywords
Acceleration methods; Conjugate gradient; Wolfe line search; Line search gradient methods; Unconstrained optimization; EFFICIENT LINE SEARCH; DESCENT;
DOI
10.1016/j.amc.2009.03.020
Chinese Library Classification
O29 [Applied Mathematics];
Discipline Code
070104 ;
Abstract
Conjugate gradient methods are important for large-scale unconstrained optimization. This paper proposes an acceleration of these methods based on a modification of the steplength. The idea is to modify in a multiplicative manner the steplength alpha(k), computed by the Wolfe line search conditions, by means of a positive parameter eta(k), in such a way as to improve the behavior of the classical conjugate gradient algorithms. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in function values is significantly improved. Numerical comparisons with some conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms. (C) 2009 Elsevier Inc. All rights reserved.
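The multiplicative acceleration described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the eta(k) formula below is an assumed quadratic-interpolation form (eta = -a/b with a = alpha*g^T d and b = alpha*(g_z - g)^T d evaluated at the trial point z), and a simple backtracking Armijo search stands in for the Wolfe line search used in the paper.

```python
# Hedged sketch of an accelerated conjugate gradient step: the steplength
# alpha found by a line search is rescaled by a positive factor eta before
# the update x_{k+1} = x_k + eta*alpha*d_k. The eta formula and the Armijo
# backtracking below are illustrative assumptions, not the paper's scheme.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def axpy(a, x, y):
    # componentwise y + a*x
    return [yi + a * xi for xi, yi in zip(x, y)]

def accelerated_cg(f, grad, x, iters=200, tol=1e-8):
    g = grad(x)
    d = [-gi for gi in g]            # initial steepest-descent direction
    for _ in range(iters):
        if dot(g, g) ** 0.5 < tol:
            break
        # Backtracking (Armijo) line search -- a simplification of the
        # Wolfe line search used in the paper.
        alpha, fx, gd = 1.0, f(x), dot(g, d)
        while f(axpy(alpha, d, x)) > fx + 1e-4 * alpha * gd:
            alpha *= 0.5
        z = axpy(alpha, d, x)        # trial point from the line search
        gz = grad(z)
        # Acceleration: multiply alpha by eta = -a/b, the minimizer of the
        # quadratic interpolant of t -> f(x + t*alpha*d) (assumed form).
        a = alpha * gd
        b = alpha * (dot(gz, d) - gd)
        x_new = axpy((-a / b) * alpha, d, x) if b > 1e-12 else z
        g_new = grad(x_new)
        # Polak-Ribiere beta, one classical conjugate gradient choice.
        y = [gn - gi for gn, gi in zip(g_new, g)]
        beta = max(0.0, dot(g_new, y) / dot(g, g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        x, g = x_new, g_new
    return x

# Usage on a small uniformly convex quadratic f(x) = x1^2 + 10*x2^2,
# whose unique minimizer is the origin:
f = lambda x: x[0] ** 2 + 10 * x[1] ** 2
grad = lambda x: [2 * x[0], 20 * x[1]]
sol = accelerated_cg(f, grad, [3.0, -2.0])
```

For a quadratic objective the factor eta*alpha reproduces the exact line minimizer along d, which is why the acceleration helps most when the plain line search steplength is far from optimal.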
Pages: 361-369
Page count: 9