Subsampled inexact Newton methods for minimizing large sums of convex functions

Cited by: 12
Authors
Bellavia, Stefania [1 ]
Krejic, Natasa [2 ]
Krklec Jerinkic, Natasa [2]
Affiliations
[1] Univ Florence, Dept Ind Engn, Viale Morgagni 40-44, I-50134 Florence, Italy
[2] Univ Novi Sad, Fac Sci, Dept Math & Informat, Trg Dositeja Obradov 4, Novi Sad 21000, Serbia
Keywords
inexact Newton; subsampled Hessian; superlinear convergence; global convergence; mean square convergence; line search; optimization methods; sample size; convergence; algorithm
DOI
10.1093/imanum/drz027
Chinese Library Classification (CLC)
O29 [Applied Mathematics]
Discipline Code
070104
Abstract
This paper deals with the minimization of a large sum of convex functions by inexact Newton (IN) methods employing subsampled functions, gradients and Hessian approximations. The conjugate gradient method is used to compute the IN step, and global convergence is enforced by a nonmonotone line-search procedure. The aim is to obtain methods with affordable cost and fast convergence. Assuming strongly convex functions, R-linear convergence and the worst-case iteration complexity of the procedure are investigated when functions and gradients are approximated with increasing accuracy. A set of rules for the forcing parameters and the Hessian subsample sizes is derived that ensures local q-linear/q-superlinear convergence of the proposed method. The random choice of the Hessian subsample is also considered, and convergence in mean square, for both finite and infinite sums of functions, is proved. Finally, the analysis of global convergence with asymptotic R-linear rate is extended to the case where the summands are convex and the objective function is strongly convex. Numerical results on well-known binary classification problems are also given. Adaptive strategies for selecting the forcing terms and the Hessian subsample size, stemming from the theoretical analysis, are employed, and the numerical results show that they yield effective IN methods.
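To make the scheme concrete, below is a minimal Python sketch of one iteration of the kind of method the abstract describes: a gradient and a Hessian are averaged over an index subsample, the Newton system is solved by conjugate gradients only up to a forcing-term tolerance, and a nonmonotone backtracking line search sets the step length. The per-sample oracles grad_fi and hess_fi, the fixed sample, and the fixed forcing term eta are illustrative assumptions introduced here; in the paper these quantities are chosen adaptively across iterations, and this sketch is not the authors' implementation.

```python
import numpy as np

def subsampled_in_step(grad_fi, hess_fi, x, sample, eta, cg_max_iter=50):
    """One subsampled inexact Newton step (illustrative sketch).

    grad_fi(x, i) / hess_fi(x, i) are assumed per-sample oracles for the
    gradient and Hessian of the i-th summand; eta is the forcing term.
    """
    # Subsampled gradient and Hessian: averages over the index subsample.
    g = np.mean([grad_fi(x, i) for i in sample], axis=0)
    H = np.mean([hess_fi(x, i) for i in sample], axis=0)

    # Conjugate gradients on H d = -g, stopped as soon as the inexact
    # Newton condition ||H d + g|| <= eta * ||g|| holds.
    d = np.zeros_like(g)
    r = -g.copy()                      # residual of H d = -g at d = 0
    p = r.copy()
    for _ in range(cg_max_iter):
        if np.linalg.norm(r) <= eta * np.linalg.norm(g):
            break
        Hp = H @ p
        alpha = (r @ r) / (p @ Hp)
        d = d + alpha * p
        r_new = r - alpha * Hp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d, g

def nonmonotone_backtracking(f, x, d, g, f_ref, c=1e-4, tau=0.5, max_back=30):
    """Backtracking line search that accepts sufficient decrease relative to
    a nonmonotone reference value f_ref (e.g. the largest recent objective
    value) rather than relative to f(x) itself."""
    t = 1.0
    for _ in range(max_back):
        if f(x + t * d) <= f_ref + c * t * (g @ d):
            break
        t *= tau
    return t
```

A full method along the lines of the abstract would grow the subsample and shrink eta from one iteration to the next; the rules derived in the paper govern exactly those choices.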
Pages: 2309-2341
Number of pages: 33
Related Papers
50 records in total
  • [21] Efficiently preconditioned inexact Newton methods for large symmetric eigenvalue problems
    Bergamaschi, L.
    Martinez, A.
    OPTIMIZATION METHODS & SOFTWARE, 2015, 30 (02) : 301 - 322
  • [22] PROXIMAL NEWTON-TYPE METHODS FOR MINIMIZING COMPOSITE FUNCTIONS
    Lee, Jason D.
    Sun, Yuekai
    Saunders, Michael A.
    SIAM JOURNAL ON OPTIMIZATION, 2014, 24 (03) : 1420 - 1443
  • [23] Subsampled Hessian Newton Methods for Supervised Learning
    Wang, Chien-Chih
    Huang, Chun-Heng
    Lin, Chih-Jen
    NEURAL COMPUTATION, 2015, 27 (08) : 1766 - 1795
  • [24] REGULARIZED NEWTON METHODS FOR MINIMIZING FUNCTIONS WITH HÖLDER CONTINUOUS HESSIANS
    Grapiglia, G. N.
    Nesterov, Yu.
    SIAM JOURNAL ON OPTIMIZATION, 2017, 27 (01) : 478 - 506
  • [25] Inexact semismooth Newton methods for large-scale complementarity problems
    Kanzow, C.
    OPTIMIZATION METHODS & SOFTWARE, 2004, 19 (3-4) : 309 - 325
  • [26] Globalized inexact proximal Newton-type methods for nonconvex composite functions
    Kanzow, Christian
    Lechner, Theresa
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 78 : 377 - 410
  • [27] Inexact Newton methods for model simulation
    Bellavia, Stefania
    Magheri, Silvia
    Miani, Claudia
    INTERNATIONAL JOURNAL OF COMPUTER MATHEMATICS, 2011, 88 (14) : 2969 - 2987
  • [28] ON SEMILOCAL CONVERGENCE OF INEXACT NEWTON METHODS
    Guo, Xueping
    JOURNAL OF COMPUTATIONAL MATHEMATICS, 2007, (02) : 231 - 242
  • [29] Concerning the convergence of inexact Newton methods
    Argyros, IK
    JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 1997, 79 (02) : 235 - 247