Multi-step nonlinear conjugate gradient methods for unconstrained minimization

Cited by: 38
Authors
Ford, John A. [2 ]
Narushima, Yasushi [1 ]
Yabe, Hiroshi [1 ]
Affiliations
[1] Tokyo Univ Sci, Dept Math Informat Sci, Shinjuku Ku, Tokyo 1628601, Japan
[2] Univ Essex, Dept Comp Sci, Colchester CO4 3SQ, Essex, England
Keywords
unconstrained optimization; conjugate gradient method; line search; global convergence; multi-step secant condition
DOI
10.1007/s10589-007-9087-z
Chinese Library Classification: C93 [Management Science]; O22 [Operations Research]
Discipline codes: 070105; 12; 1201; 1202; 120202
Abstract
Conjugate gradient methods are appealing for large-scale nonlinear optimization problems because they avoid storing matrices. Recently, seeking fast convergence of these methods, Dai and Liao (Appl. Math. Optim. 43:87-101, 2001) proposed a conjugate gradient method based on the secant condition of quasi-Newton methods, and later Yabe and Takano (Comput. Optim. Appl. 28:203-225, 2004) proposed another conjugate gradient method based on a modified secant condition. In this paper, we make use of a multi-step secant condition given by Ford and Moghrabi (Optim. Methods Softw. 2:357-370, 1993; J. Comput. Appl. Math. 50:305-323, 1994) and propose two new conjugate gradient methods based on this condition. The methods are shown to be globally convergent under certain assumptions. Numerical results are reported.
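For context, the Dai-Liao method cited in the abstract chooses the conjugate gradient parameter as beta_k = g_{k+1}^T (y_k - t s_k) / (d_k^T y_k), where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k. The sketch below illustrates that cited method, not this paper's multi-step variant; the function name, the backtracking Armijo line search, and the restart safeguards are illustrative choices rather than details from the source.

```python
import numpy as np

def dai_liao_cg(f, grad, x0, t=0.1, tol=1e-8, max_iter=500):
    """Nonlinear CG with the Dai-Liao parameter
        beta_k = g_{k+1}^T (y_k - t*s_k) / (d_k^T y_k),
    where s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k.
    Line search and safeguards are illustrative, not from the paper."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Simple backtracking (Armijo) line search on the descent direction d
        alpha, c1, rho = 1.0, 1e-4, 0.5
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        s = x_new - x            # step s_k
        y = g_new - g            # gradient change y_k
        denom = d @ y
        if abs(denom) < 1e-12:   # safeguard: fall back to steepest descent
            beta = 0.0
        else:
            beta = g_new @ (y - t * s) / denom
        d = -g_new + beta * d
        if g_new @ d >= 0:       # restart if d is not a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_liao_cg(f, grad, np.zeros(2))
```

The parameter t trades off the conjugacy condition against a descent-promoting term; t = 0 recovers the Hestenes-Stiefel formula, which is one reason the Dai-Liao family is viewed as a unifying framework.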
Pages: 191-216 (26 pages)
Related articles (50 total)
  • [41] Liu, Meixing; Ma, Guodong; Yin, Jianghua. Two New Conjugate Gradient Methods for Unconstrained Optimization. Complexity, 2020.
  • [42] Li, Ming; Liu, Hongwei; Liu, Zexian. A new family of conjugate gradient methods for unconstrained optimization. Journal of Applied Mathematics and Computing, 2018, 58(1-2): 219-234.
  • [43] Du, Xuewu; Zhang, Peng; Ma, Wenya. Some modified conjugate gradient methods for unconstrained optimization. Journal of Computational and Applied Mathematics, 2016, 305: 92-114.
  • [45] Feng, Huantao; Xiao, Wei. Two New Conjugate Gradient Methods for Unconstrained Optimization. Proceedings of 2008 International Pre-Olympic Congress on Computer Science, Vol. II: Information Science and Engineering, 2008: 462-465.
  • [46] Dehghani, R.; Bidabadi, N. Two-step conjugate gradient method for unconstrained optimization. Computational & Applied Mathematics, 2020, 39(3).
  • [47] Pinto, Silvio F. B.; de Lamare, Rodrigo C. Multi-step Knowledge-Aided Iterative Conjugate Gradient Algorithms for DOA Estimation. Circuits, Systems, and Signal Processing, 2019, 38(8): 3841-3859.
  • [48] Luksan, Ladislav; Vlcek, Jan. Nonlinear Conjugate Gradient Methods. Programs and Algorithms of Numerical Mathematics 17, 2015: 130-135.
  • [50] Su, Zheng; Hu, Jiaqiao; Zhu, Wei. Multi-step variance minimization in sequential tests. Statistics and Computing, 2008, 18: 101-108.