Two modified three-term conjugate gradient methods with sufficient descent property

Cited: 35
Authors
Babaie-Kafaki, Saman [1 ]
Ghanbari, Reza [2 ]
Affiliations
[1] Semnan Univ, Dept Math, Fac Math Stat & Comp Sci, Semnan, Iran
[2] Ferdowsi Univ Mashhad, Fac Math Sci, Mashhad, Iran
Keywords
Unconstrained optimization; Large-scale optimization; Conjugate gradient algorithm; Secant equation; Sufficient descent condition; Global convergence; UNCONSTRAINED OPTIMIZATION; GLOBAL CONVERGENCE; ALGORITHM;
DOI
10.1007/s11590-014-0736-8
Chinese Library Classification
C93 (Management Science); O22 (Operations Research);
Subject Classification Codes
070105 ; 12 ; 1201 ; 1202 ; 120202 ;
Abstract
Based on the insight gained from the three-term conjugate gradient methods suggested by Zhang et al. (Optim Methods Softw 22:697-711, 2007), two nonlinear conjugate gradient methods are proposed that modify the conjugate gradient methods of Dai and Liao (Appl Math Optim 43:87-101, 2001) and Zhou and Zhang (Optim Methods Softw 21:707-714, 2006). The methods can be regarded as modified versions of two three-term conjugate gradient methods proposed by Sugiki et al. (J Optim Theory Appl 153:733-757, 2012), in which the search directions are computed using the secant equations in a way that achieves the sufficient descent property. One of the methods is shown to be globally convergent for uniformly convex objective functions, while the other is shown to be globally convergent without a convexity assumption on the objective function. Comparative numerical results demonstrating the efficiency of the proposed methods are reported.
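The abstract does not reproduce the search-direction formulas, but the Zhang et al. (2007) three-term scheme it builds on is well known: d_k = -g_k + beta_k d_{k-1} - theta_k y_{k-1}, with y_{k-1} = g_k - g_{k-1}, the PRP parameter beta_k = g_k^T y_{k-1} / ||g_{k-1}||^2, and theta_k = g_k^T d_{k-1} / ||g_{k-1}||^2, which makes g_k^T d_k = -||g_k||^2 hold identically. A minimal sketch of this baseline direction update (an illustration of the motivating method, not the paper's two modified methods):

```python
import numpy as np

def three_term_direction(g, g_prev, d_prev):
    """Three-term CG direction in the spirit of Zhang et al. (2007):
    d = -g + beta * d_prev - theta * y, with y = g - g_prev.

    With beta = (g.y) / ||g_prev||^2 (PRP) and theta = (g.d_prev) / ||g_prev||^2,
    the two extra terms cancel in g.d, so the sufficient descent identity
    g.d = -||g||^2 holds exactly, independent of the line search.
    """
    y = g - g_prev
    denom = g_prev @ g_prev          # ||g_{k-1}||^2
    beta = (g @ y) / denom           # PRP conjugacy parameter
    theta = (g @ d_prev) / denom     # third-term coefficient
    return -g + beta * d_prev - theta * y

# Numerical check of the descent identity on random data.
rng = np.random.default_rng(0)
g, g_prev, d_prev = (rng.normal(size=5) for _ in range(3))
d = three_term_direction(g, g_prev, d_prev)
assert np.isclose(g @ d, -(g @ g))   # g_k^T d_k = -||g_k||^2
```

The paper's contribution replaces pieces of this construction using the Dai-Liao and secant-equation machinery; the exact modified coefficients are given in the article itself.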
Pages: 2285-2297
Page count: 13