Dynamic scaling based preconditioning for truncated Newton methods in large scale unconstrained optimization

Cited by: 11
Authors
Roma, M. [1]
Affiliation
[1] Univ Roma La Sapienza, Dipartimento Informat & Sistemist A Ruberti, I-00185 Rome, Italy
Source
OPTIMIZATION METHODS & SOFTWARE | 2005, Vol. 20, No. 6
Keywords
truncated Newton method; conjugate gradient (CG) method; preconditioning; row-column scaling; equilibrated matrix
DOI
10.1080/10556780410001727709
CLC number
TP31 [Computer software]
Discipline codes
081202; 0835
Abstract
This paper deals with the preconditioning of truncated Newton methods for the solution of large scale nonlinear unconstrained optimization problems. We focus on preconditioners which can be naturally embedded in the framework of truncated Newton methods, i.e. which can be built without storing the Hessian matrix of the function to be minimized, but only based upon information on the Hessian obtained from products of the Hessian matrix with a vector. In particular, we propose a diagonal preconditioner which enjoys this feature and which enables us to examine the effect of diagonal scaling on truncated Newton methods. In fact, this new preconditioner carries out a scaling strategy and is based on the concept of equilibration of the data in linear systems of equations. Extensive numerical testing has been performed, showing that the proposed diagonal preconditioning strategy is very effective. On most problems considered, the resulting diagonally preconditioned truncated Newton method performs better than both the unpreconditioned method and the one using an automatic preconditioner based on limited memory quasi-Newton updating (PREQN) recently proposed by Morales and Nocedal [Morales, J.L. and Nocedal, J., 2000, Automatic preconditioning by limited memory quasi-Newton updating. SIAM Journal on Optimization, 10, 1079-1096].
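The two ingredients the abstract describes — an inner CG loop that accesses the Hessian only through Hessian-vector products, and a diagonal preconditioner built from that same matrix-free information — can be illustrated with a minimal sketch. This is a generic illustration, not the paper's equilibration-based row-column scaling: the diagonal below is estimated by a Hutchinson-style probe (diag(H) ≈ E[z ⊙ Hz] for random ±1 vectors z), and all function names (`estimate_diagonal`, `preconditioned_cg`) are hypothetical.

```python
import numpy as np

def estimate_diagonal(hvp, n, samples=30, seed=None):
    """Matrix-free estimate of diag(H) via Rademacher probing:
    diag(H) ~ average of z * (H z) over random +/-1 vectors z."""
    rng = np.random.default_rng(seed)
    d = np.zeros(n)
    for _ in range(samples):
        z = rng.choice([-1.0, 1.0], size=n)
        d += z * hvp(z)
    return d / samples

def preconditioned_cg(hvp, b, M_diag, tol=1e-8, maxiter=300):
    """CG for H x = b using only Hessian-vector products hvp(v),
    with diagonal preconditioner M = diag(M_diag)."""
    x = np.zeros_like(b)
    r = b - hvp(x)
    z = r / M_diag          # apply M^{-1}
    p = z.copy()
    rz = r @ z
    for _ in range(maxiter):
        Hp = hvp(p)
        alpha = rz / (p @ Hp)
        x += alpha * p
        r -= alpha * Hp
        if np.linalg.norm(r) < tol:
            break
        z = r / M_diag
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Synthetic ill-conditioned SPD "Hessian" for demonstration only.
n = 50
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
H = Q @ np.diag(np.logspace(0, 4, n)) @ Q.T
g = rng.standard_normal(n)
hvp = lambda v: H @ v   # stands in for a finite-difference/autodiff product

d = np.abs(estimate_diagonal(hvp, n, seed=1))
d = np.maximum(d, 1e-12)            # safeguard positivity of the scaling
x = preconditioned_cg(hvp, -g, d)   # inexact Newton step: solve H x = -g
```

In an actual truncated Newton code, `hvp` would be a finite-difference or automatic-differentiation Hessian-vector product and the CG loop would be truncated by a forcing-term test rather than run to high accuracy; the point of the sketch is that the preconditioner never requires forming or storing H.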
Pages: 693-713
Page count: 21