A modified scaled memoryless BFGS preconditioned conjugate gradient method for unconstrained optimization

Cited: 0
|
Authors
Saman Babaie-Kafaki
Affiliations
[1] Semnan University, Department of Mathematics, Faculty of Mathematics, Statistics and Computer Sciences
[2] Institute for Research in Fundamental Sciences (IPM), School of Mathematics
Source
4OR | 2013 / Volume 11
Keywords
Unconstrained optimization; Scaled conjugate gradient method; Modified secant equation; Sufficient descent condition; Global convergence; 65K05; 90C53; 49M37;
DOI
Not available
Abstract
In order to propose a scaled conjugate gradient method, the memoryless BFGS preconditioned conjugate gradient method suggested by Shanno and the spectral conjugate gradient method suggested by Birgin and Martínez are hybridized following Andrei’s approach. Since the proposed method is based on a revised form of a modified secant equation suggested by Zhang et al., an interesting feature is that it uses available function values in addition to gradient values. It is shown that, for uniformly convex objective functions, the search directions of the method satisfy the sufficient descent condition, which leads to global convergence. Numerical comparisons between implementations of the method and an efficient scaled conjugate gradient method proposed by Andrei, carried out on a set of unconstrained optimization test problems from the CUTEr collection, demonstrate the efficiency of the proposed modified scaled conjugate gradient method in the sense of the performance profile introduced by Dolan and Moré.
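As an illustrative sketch only (not the paper’s exact algorithm), the Python code below shows the general shape of the ingredients the abstract names: a Shanno-style memoryless BFGS search direction with a Birgin–Martínez-type spectral scaling, combined with a Zhang et al.-style modified secant vector that incorporates function values as well as gradients. The particular scaling choice, the backtracking Armijo line search, and the steepest-descent safeguard are assumptions made here for demonstration.

```python
import numpy as np

def modified_secant_vector(s, y, f_prev, f_curr, g_prev, g_curr):
    # Zhang et al.-style modified secant vector (illustrative form):
    #   z = y + (theta / s^T s) * s,
    #   theta = 6*(f_prev - f_curr) + 3*(g_prev + g_curr)^T s.
    # For quadratic objectives theta vanishes and z reduces to y.
    theta = 6.0 * (f_prev - f_curr) + 3.0 * np.dot(g_prev + g_curr, s)
    return y + (theta / np.dot(s, s)) * s

def scaled_memoryless_bfgs_direction(g, s, z):
    # Direction d = -H g, where H is the memoryless BFGS update of t*I
    # built from (s, z); t = s^T z / z^T z is one common spectral scaling.
    sz = np.dot(s, z)
    t = sz / np.dot(z, z)
    gs, gz = np.dot(g, s), np.dot(g, z)
    return (-t * g
            + t * (gs / sz) * z
            - ((1.0 + t * np.dot(z, z) / sz) * gs / sz - t * gz / sz) * s)

def minimize(f, grad, x0, iters=500, tol=1e-8):
    x = x0.astype(float).copy()
    g, fx = grad(x), f(x)
    d = -g                                  # first step: steepest descent
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking Armijo line search (an assumption of this sketch).
        alpha, gd = 1.0, np.dot(g, d)
        while f(x + alpha * d) > fx + 1e-4 * alpha * gd:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new, f_new = grad(x_new), f(x_new)
        z = modified_secant_vector(s, g_new - g, fx, f_new, g, g_new)
        d = scaled_memoryless_bfgs_direction(g_new, s, z)
        if np.dot(g_new, d) >= 0.0:         # safeguard: keep a descent direction
            d = -g_new
        x, g, fx = x_new, g_new, f_new
    return x
```

On a uniformly convex quadratic, for example, the modified secant vector coincides with the ordinary gradient difference and the iteration behaves like a preconditioned conjugate gradient method.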
Pages: 361–374 (13 pages)