Truncated partitioning group correction algorithms for large-scale sparse unconstrained optimization

Cited: 2
Authors
Li, Junxiang [1 ]
Yu, Bo [1 ]
Affiliation
[1] Dalian Univ Technol, Dept Appl Math, Dalian 116024, Peoples R China
Keywords
unconstrained optimization; truncated Newton-like methods; inexact; sparsity; partition; conjugate gradient algorithms;
DOI
10.1016/j.amc.2007.01.023
Chinese Library Classification
O29 [Applied Mathematics]
Subject Classification Code
070104
Abstract
This paper presents approaches to solving large-scale sparse unconstrained optimization problems based on a successive partitioning group correction algorithm. In large-scale optimization, solving the Newton-like equations exactly at each iteration can be expensive and is often not justified when the iterate is far from a solution. Instead, an inexact solution of the Newton-like equations is computed by a conjugate gradient method. The methods also rely on a symmetric consistent partition of the columns of the Hessian matrix. A q-superlinear convergence result and an r-convergence rate estimate show that the methods have good local convergence properties. Global convergence is proven, and the numerical results show that the methods may be competitive with some currently used algorithms. (C) 2007 Elsevier Inc. All rights reserved.
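The abstract outlines a truncated (inexact) Newton framework: rather than solving the Newton-like equations exactly, each outer iteration runs a conjugate gradient inner loop that is stopped early once the residual is small enough relative to the gradient. The sketch below illustrates that generic truncated Newton/CG idea in Python; it is not the authors' partitioning group correction method, and the sparse column-partition machinery from the paper is omitted. The Rosenbrock test objective, the forcing rule eta_k = min(0.5, sqrt(||g_k||)), the finite-difference Hessian-vector products, and the Armijo backtracking step are assumptions made only for illustration.

import numpy as np

def rosenbrock(x):
    # Illustrative test objective; not a problem from the paper.
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def hess_vec(grad, x, v, eps=1e-7):
    # Forward-difference Hessian-vector product; avoids forming the Hessian.
    return (grad(x + eps * v) - grad(x)) / eps

def truncated_cg(grad, x, g, eta, max_iter=50):
    # Approximately solve H d = -g by CG; stop ("truncate") once the residual
    # satisfies ||H d + g|| <= eta * ||g|| or negative curvature is detected.
    d = np.zeros_like(g)
    r = -g.copy()                 # residual of H d = -g at d = 0
    p = r.copy()
    rs = r @ r
    tol = eta * np.linalg.norm(g)
    for _ in range(max_iter):
        Hp = hess_vec(grad, x, p)
        pHp = p @ Hp
        if pHp <= 1e-12 * (p @ p):
            return d if d.any() else -g   # fall back to steepest descent
        alpha = rs / pHp
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

def truncated_newton(f, grad, x0, tol=1e-6, max_outer=200):
    x = x0.astype(float)
    for _ in range(max_outer):
        g = grad(x)
        gnorm = np.linalg.norm(g)
        if gnorm <= tol:
            break
        eta = min(0.5, np.sqrt(gnorm))    # assumed forcing sequence
        d = truncated_cg(grad, x, g, eta)
        t = 1.0                           # Armijo backtracking line search
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5
        x = x + t * d
    return x

if __name__ == "__main__":
    x_star = truncated_newton(rosenbrock, rosenbrock_grad, np.full(100, -1.2))
    print("final objective value:", rosenbrock(x_star))

In this sketch the inner CG loop plays the role of the inexact solver described in the abstract; the paper's partition of the Hessian columns would replace the finite-difference Hessian-vector products with a sparse, group-wise Hessian approximation.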
Pages: 242-254
Page count: 13