Another nonlinear conjugate gradient algorithm for unconstrained optimization

Cited by: 14
Authors
Andrei, Neculai [1 ]
Affiliations
[1] Ctr Adv Modelling & Optimizat, Res Inst Informat, Bucharest, Romania
Source
OPTIMIZATION METHODS & SOFTWARE | 2009, Vol. 24, Issue 1
Keywords
unconstrained optimization; conjugate gradient method; sufficient descent condition; conjugacy condition; numerical comparisons; CONVERGENCE CONDITIONS; MINIMIZATION; DESCENT;
DOI
10.1080/10556780802393326
Chinese Library Classification
TP31 [Computer software];
Discipline classification codes
081202; 0835;
Abstract
A nonlinear conjugate gradient algorithm is proposed as a modification of the Dai and Yuan [A nonlinear conjugate gradient method with a strong global convergence property, SIAM J. Optim. 10 (1999), pp. 177-182] conjugate gradient algorithm, satisfying a parameterized sufficient descent condition with a parameter δ_k. The parameter δ_k is computed by means of the conjugacy condition, yielding an algorithm that is a positive multiplicative modification of the Hestenes and Stiefel [Methods of conjugate gradients for solving linear systems, J. Res. Nat. Bur. Standards Sec. B 48 (1952), pp. 409-436] algorithm. The algorithm can be viewed as an adaptive version of the Dai and Liao [New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), pp. 87-101] conjugate gradient algorithm. Close to our computational scheme is the conjugate gradient algorithm recently proposed by Hager and Zhang [A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J. Optim. 16 (2005), pp. 170-192]. Computational results, for a set consisting of 750 unconstrained optimization test problems, show that this new conjugate gradient algorithm substantially outperforms the known conjugate gradient algorithms.
Pages: 89-104
Number of pages: 16
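For orientation, the general shape of the method discussed in the abstract can be sketched as a nonlinear conjugate gradient loop with a Hestenes-Stiefel-type β update. The sketch below is a minimal, generic baseline under assumed names (`f`, `grad`, `nonlinear_cg_hs`); it uses the classical Hestenes-Stiefel formula with a simple Armijo backtracking line search, not the paper's specific δ_k-parameterized modification.

```python
import numpy as np

def nonlinear_cg_hs(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear conjugate gradient with the classical
    Hestenes-Stiefel beta and Armijo backtracking line search.
    Illustrative sketch only; NOT the delta_k-modified algorithm
    of the paper above."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0.0:                  # safeguard: restart if d is not a descent direction
            d = -g
        # Armijo backtracking line search
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                        # gradient difference y_k
        denom = d.dot(y)
        # Hestenes-Stiefel beta: g_{k+1}^T y_k / (d_k^T y_k), kept nonnegative
        beta = g_new.dot(y) / denom if abs(denom) > 1e-12 else 0.0
        beta = max(beta, 0.0)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = nonlinear_cg_hs(f, grad, np.zeros(2))
```

On a strictly convex quadratic like the usage example, the minimizer coincides with the solution of the linear system A x = b, which makes the sketch easy to sanity-check.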