Two fundamental convergence theorems for nonlinear conjugate gradient methods and their applications

Cited by: 0

Authors
Han Jiye
Liu Guanghui
Sun Defeng
Yin Hongxia
Affiliations
[1] Institute of Applied Mathematics,The Academy of Mathematics and Systems Sciences
[2] Institute of Applied Mathematics,the Chinese Academy of Sciences
[3] Northwestern University,Department of Industrial Engineering and Management Sciences
[4] University of New South Wales,School of Mathematics
[5] the Chinese Academy of Sciences,Hua Luo
Keywords
Conjugate gradient method; descent condition; global convergence
DOI
10.1007/BF02669682
Abstract
Two fundamental convergence theorems are given for nonlinear conjugate gradient methods under only the descent condition. As a result, methods related to the Fletcher-Reeves algorithm still converge for parameters in a slightly wider range, in particular when the parameter attains its upper bound. For methods related to the Polak-Ribière algorithm, it is shown that some negative values of the conjugate parameter do not prevent convergence. If the objective function is convex, some convergence results hold for the Hestenes-Stiefel algorithm.
Pages: 38-46 (8 pages)