Family weak conjugate gradient algorithms and their convergence analysis for nonconvex functions

Cited: 0
Authors
Gonglin Yuan
Xiaoliang Wang
Zhou Sheng
Institutions
[1] Guangxi University,College of Mathematics and Information Science
[2] Dalian University of Technology,School of Mathematical Sciences
[3] Nanjing University of Aeronautics and Astronautics,Department of Mathematics
Source
Numerical Algorithms | 2020 / Vol. 84
Keywords
Nonconvex functions; Sufficient descent; Trust region; Inexact line search; Global convergence; 90C26;
DOI
Not available
Abstract
It is well known that conjugate gradient algorithms are widely applied in many practical fields, such as engineering problems and finance models, because they are straightforward and characterized by a simple structure and low storage. However, challenging problems remain, such as the convergence of PRP algorithms for nonconvex functions under an inexact line search, obtaining sufficient descent for all conjugate gradient methods, and other theoretical properties regarding global convergence and the trust region feature for nonconvex functions. This paper studies a family of conjugate gradient formulas based on the six classic formulas, PRP, HS, CD, FR, LS, and DY, where the family conjugate gradient algorithms have better theoretical properties than the underlying formulas themselves. Furthermore, the presented technique can be extended to any two-term conjugate gradient formula. This paper designs family conjugate gradient algorithms for nonconvex functions, which have the following features without additional conditions: (i) the sufficient descent property holds, (ii) the trust region feature is true, and (iii) global convergence holds under standard assumptions. Numerical results show that the given conjugate gradient algorithms are competitive with standard methods.
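To illustrate the setting the abstract describes, the following is a minimal sketch of a classic two-term PRP conjugate gradient method with an Armijo (inexact) line search on a standard nonconvex test function. It is not the paper's family formula: the sufficient descent property is enforced here by an explicit restart safeguard, whereas the paper's family of formulas guarantees it by construction. All names (`prp_cg`, the tolerances, the Rosenbrock test problem) are illustrative assumptions.

```python
import numpy as np

def rosenbrock(x):
    # Classic nonconvex test function with minimum f(1, 1) = 0.
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def prp_cg(f, grad, x0, max_iter=2000, tol=1e-6, c=1e-4, rho=0.5):
    """PRP conjugate gradient with Armijo backtracking line search.

    Safeguard: if the PRP direction fails the sufficient-descent
    test g_k^T d_k <= -eps * ||g_k||^2, restart with steepest
    descent (a common fix; the paper's formulas avoid the need
    for such restarts).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent (always descent)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Armijo backtracking: accept alpha once the decrease
        # f(x + alpha d) <= f(x) + c * alpha * g^T d is achieved.
        alpha, fx, gTd = 1.0, f(x), g @ d
        while f(x + alpha * d) > fx + c * alpha * gTd:
            alpha *= rho
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP parameter: beta = g_{k+1}^T (g_{k+1} - g_k) / ||g_k||^2
        beta = g_new @ (g_new - g) / (g @ g)
        d_new = -g_new + beta * d
        # Sufficient-descent safeguard: restart when violated.
        if g_new @ d_new > -1e-8 * (g_new @ g_new):
            d_new = -g_new
        x, g, d = x_new, g_new, d_new
    return x, f(x)

x_star, f_star = prp_cg(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

The restart safeguard is what the family formulas render unnecessary: by building the sufficient descent and trust region properties into the direction itself, convergence analysis for nonconvex objectives no longer depends on ad hoc restarts or exact line searches.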
Pages: 935-956 (21 pages)