Nonlinear conjugate gradient methods with structured secant condition for nonlinear least squares problems

Cited by: 26
Authors
Kobayashi, Michiya
Narushima, Yasushi [1 ]
Yabe, Hiroshi [1 ]
Affiliations
[1] Tokyo Univ Sci, Shinjuku Ku, Tokyo 1628601, Japan
Funding
Japan Society for the Promotion of Science;
Keywords
Least squares problems; Conjugate gradient method; Line search; Global convergence; Structured secant condition; QUASI-NEWTON METHODS; GLOBAL CONVERGENCE PROPERTIES; SUPERLINEAR CONVERGENCE; MINIMIZATION; ALGORITHMS;
DOI
10.1016/j.cam.2009.12.031
CLC classification number
O29 [Applied Mathematics];
Subject classification code
070104;
Abstract
In this paper, we deal with conjugate gradient methods for solving nonlinear least squares problems. Several Newton-like methods have been studied for solving nonlinear least squares problems, which include the Gauss-Newton method, the Levenberg-Marquardt method and the structured quasi-Newton methods. On the other hand, conjugate gradient methods are appealing for general large-scale nonlinear optimization problems. By combining the structured secant condition and the idea of Dai and Liao (2001) [20], the present paper proposes conjugate gradient methods that make use of the structure of the Hessian of the objective function of nonlinear least squares problems. The proposed methods are shown to be globally convergent under some assumptions. Finally, some numerical results are given. (C) 2010 Elsevier B.V. All rights reserved.
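To illustrate the base ingredient the abstract builds on, the following is a minimal, generic Dai-Liao-type conjugate gradient sketch for minimizing f(x) = ½‖r(x)‖², the nonlinear least squares objective. The function name, the Armijo backtracking line search, and the descent-restart safeguard are illustrative assumptions; the paper's proposed methods replace the gradient-change vector y_k with one derived from the structured secant condition and use their own line-search conditions, which are not reproduced here.

```python
import numpy as np

def dai_liao_cg_least_squares(residual, jacobian, x0, t=0.1,
                              tol=1e-8, max_iter=500):
    """Minimize f(x) = 0.5 * ||r(x)||^2 with a Dai-Liao-type CG
    iteration (Dai and Liao, 2001).  `t` is the Dai-Liao parameter.
    Generic sketch only -- NOT the structured variant of the paper."""
    x = np.asarray(x0, dtype=float)
    g = jacobian(x).T @ residual(x)   # gradient of f is J(x)^T r(x)
    d = -g                            # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple Armijo backtracking, standing in for the paper's
        # line-search conditions.
        f_val = 0.5 * residual(x) @ residual(x)
        alpha = 1.0
        while (0.5 * residual(x + alpha * d) @ residual(x + alpha * d)
               > f_val + 1e-4 * alpha * (g @ d)) and alpha > 1e-12:
            alpha *= 0.5
        x_new = x + alpha * d
        g_new = jacobian(x_new).T @ residual(x_new)
        s = x_new - x                 # step s_k
        y = g_new - g                 # gradient change y_k
        denom = d @ y
        # Dai-Liao beta: g_{k+1}^T (y_k - t s_k) / (d_k^T y_k)
        beta = (g_new @ (y - t * s)) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0.0:          # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

The structured methods of the paper exploit the Hessian decomposition J(x)ᵀJ(x) + Σᵢ rᵢ(x)∇²rᵢ(x), approximating only the second (non-Gauss-Newton) term; the sketch above uses the plain gradient change instead.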
Pages: 375-397
Page count: 23