A conjugate gradient method with sufficient descent property

Cited by: 0
Authors
Liu, Hao [1 ]
Wang, Haijun [2 ]
Qian, Xiaoyan [1 ]
Rao, Feng [1 ]
Affiliations
[1] Nanjing Tech Univ, Coll Sci, Nanjing 210009, Jiangsu, Peoples R China
[2] China Univ Min & Technol, Coll Sci, Xuzhou 221008, Jiangsu, Peoples R China
Keywords
Conjugate gradient method; Descent direction; Memoryless quasi-Newton method; Global convergence; METRIC SSVM ALGORITHMS; QUASI-NEWTON METHODS; UNCONSTRAINED OPTIMIZATION; GLOBAL CONVERGENCE; LINE SEARCH; SECANT CONDITION; MINIMIZATION; PERFORMANCE
DOI
10.1007/s11075-014-9946-5
Chinese Library Classification
O29 [Applied Mathematics]
Discipline Classification Code
070104
Abstract
In this paper, a new nonlinear conjugate gradient method is proposed whose search direction can be viewed as a simple approximation to that of the memoryless BFGS method. The search direction of the proposed method satisfies the sufficient descent property regardless of the line search used. Global convergence of the new method is established for uniformly convex functions and for general functions under the standard Wolfe line search. Numerical experiments are carried out to test the efficiency of the new method, and the results indicate that it is promising.
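For reference, the technical terms in the abstract have the following standard forms (these are the generic definitions commonly used in the conjugate gradient literature, not the authors' specific formulas or parameter choices). The sufficient descent property requires a constant $c > 0$ such that
$$ g_k^T d_k \le -c\,\|g_k\|^2 \quad \text{for all } k, $$
and the standard Wolfe line search selects a step size $\alpha_k$ satisfying
$$ f(x_k + \alpha_k d_k) \le f(x_k) + \delta\,\alpha_k\,g_k^T d_k, \qquad g(x_k + \alpha_k d_k)^T d_k \ge \sigma\,g_k^T d_k, $$
with $0 < \delta < \sigma < 1$. The memoryless BFGS direction that the proposed direction approximates is the classical one obtained by applying the BFGS update to the identity matrix: with $s_k = x_{k+1} - x_k$ and $y_k = g_{k+1} - g_k$,
$$ d_{k+1} = -g_{k+1} + \frac{g_{k+1}^T s_k}{s_k^T y_k}\, y_k + \left[ \frac{g_{k+1}^T y_k}{s_k^T y_k} - \left(1 + \frac{\|y_k\|^2}{s_k^T y_k}\right) \frac{g_{k+1}^T s_k}{s_k^T y_k} \right] s_k. $$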
Pages: 269-286
Number of pages: 18