Hybrid Conjugate Gradient Algorithm for Unconstrained Optimization

Cited by: 60
Authors
Andrei, N. [1 ]
Affiliation
[1] Ctr Adv Modeling & Optimizat, Res Inst Informat, Bucharest, Romania
Keywords
Unconstrained optimization; Hybrid conjugate gradient method; Conjugacy condition; Numerical comparisons; CONVERGENCE; MINIMIZATION; DESCENT;
DOI
10.1007/s10957-008-9505-0
CLC classification
C93 [Management Science]; O22 [Operations Research];
Discipline codes
070105; 12; 1201; 1202; 120202;
Abstract
In this paper a new hybrid conjugate gradient algorithm is proposed and analyzed. The parameter β_k is computed as a convex combination of the Polak-Ribière-Polyak and the Dai-Yuan conjugate gradient parameters, i.e., β_k^N = (1 − θ_k) β_k^PRP + θ_k β_k^DY. The parameter θ_k in the convex combination is computed in such a way that the conjugacy condition is satisfied, independently of the line search. The line search uses the standard Wolfe conditions. The algorithm generates descent directions and, when the iterates jam, the directions satisfy the sufficient descent condition. Numerical comparisons with conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that this hybrid computational scheme outperforms the known hybrid conjugate gradient algorithms.
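The convex-combination update described in the abstract can be sketched as follows. The PRP and DY formulas below are the standard ones; the paper derives θ_k from the conjugacy condition, but that derivation is not given in the abstract, so here θ is taken as a plain input clipped to [0, 1] — an illustrative sketch, not the paper's exact scheme.

```python
import numpy as np

def beta_prp(g_new, g_old):
    """Standard Polak-Ribiere-Polyak parameter: g_{k+1}^T y_k / ||g_k||^2."""
    y = g_new - g_old  # gradient difference y_k
    return g_new @ y / (g_old @ g_old)

def beta_dy(g_new, g_old, d):
    """Standard Dai-Yuan parameter: ||g_{k+1}||^2 / (d_k^T y_k)."""
    y = g_new - g_old
    return g_new @ g_new / (d @ y)

def beta_hybrid(g_new, g_old, d, theta):
    """Convex combination beta^N = (1 - theta) beta^PRP + theta beta^DY.

    In the paper theta_k is computed from the conjugacy condition; here
    it is simply supplied by the caller (hypothetical simplification).
    """
    theta = min(max(theta, 0.0), 1.0)  # keep the combination convex
    return (1.0 - theta) * beta_prp(g_new, g_old) + theta * beta_dy(g_new, g_old, d)
```

The new search direction would then be formed as d_{k+1} = −g_{k+1} + β_k^N d_k, with the step length chosen under the standard Wolfe conditions.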
Pages: 249–264
Page count: 16