Scaled memoryless BFGS preconditioned steepest descent method for very large-scale unconstrained optimization

Cited by: 2

Authors:
Leong, Wah June [1 ]
Abu Hassan, Malik [2 ]
Affiliations:
[1] Univ Putra Malaysia, Inst Math Res, Serdang 43400, Selangor, Malaysia
[2] Univ Putra Malaysia, Dept Math, Serdang 43400, Selangor, Malaysia
Keywords: Large-scale optimization; preconditioning; gradient method; scaled memoryless BFGS
DOI: 10.1080/02522667.2009.10699885
Abstract:
A preconditioned steepest descent (SD) method for solving very large (with dimensions up to 10^6) unconstrained optimization problems is presented. The basic idea is to incorporate the preconditioning technique into the framework of the SD method. The preconditioner, which is also a scaled memoryless BFGS updating matrix, is selected in place of the usual scaling strategy for the SD method. The scaled memoryless BFGS preconditioned SD direction can then be computed without any additional storage compared with a standard scaled SD direction. Under very mild conditions it is shown that, for uniformly convex functions, the method is globally and linearly convergent. Numerical results are also given to illustrate the use of such preconditioning within the SD method. Our numerical study shows that the newly proposed preconditioned SD method significantly outperforms the SD method with Oren-Luenberger scaling and the conjugate gradient method, and is comparable to the limited-memory BFGS method.
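The direction described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's exact algorithm: it assumes the standard memoryless BFGS inverse update of a scaled identity with Oren-Luenberger scaling theta = s'y / y'y, a plain Armijo backtracking line search, and a fallback to the steepest descent direction when the curvature condition fails. Note that the direction is formed from inner products and vector updates only, so no matrix is ever stored — the storage-free property the abstract claims.

```python
import numpy as np

def memoryless_bfgs_direction(g, s, y):
    """Direction d = -H g, where H is the BFGS update of the scaled
    identity theta*I (Oren-Luenberger scaling theta = s'y / y'y),
    expanded into vector operations only -- no matrix storage."""
    sy = s @ y
    if sy <= 1e-12:                 # curvature condition fails: fall back to SD
        return -g
    theta = sy / (y @ y)            # Oren-Luenberger scaling factor (assumed)
    rho = 1.0 / sy
    sg, yg = s @ g, y @ g
    # H = (I - rho*s*y') * theta*I * (I - rho*y*s') + rho*s*s', applied to g:
    Hg = theta * (g - rho * sg * y - rho * yg * s
                  + rho**2 * (y @ y) * sg * s) + rho * sg * s
    return -Hg

def minimize(f, grad, x0, iters=200, tol=1e-8):
    """Preconditioned SD loop with Armijo backtracking (illustrative)."""
    x = x0.copy()
    g = grad(x)
    s = y = None
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        d = -g if s is None else memoryless_bfgs_direction(g, s, y)
        t, fx = 1.0, f(x)
        for _ in range(50):         # backtracking Armijo line search
            if f(x + t * d) <= fx + 1e-4 * t * (g @ d):
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g # most recent step / gradient difference
        x, g = x_new, g_new
    return x
```

On a uniformly convex quadratic f(x) = 0.5 x'Ax - b'x (gradient Ax - b), the iterates converge to A^{-1} b, consistent with the linear convergence result stated for uniformly convex functions.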
Pages: 387-396 (10 pages)