Subsampled inexact Newton methods for minimizing large sums of convex functions

Cited by: 12
Authors
Bellavia, Stefania [1 ]
Krejic, Natasa [2 ]
Krklec Jerinkic, Natasa [2]
Affiliations
[1] Univ Florence, Dept Ind Engn, Viale Morgagni 40-44, I-50134 Florence, Italy
[2] Univ Novi Sad, Fac Sci, Dept Math & Informat, Trg Dositeja Obradov 4, Novi Sad 21000, Serbia
Keywords
inexact Newton; subsampled Hessian; superlinear convergence; global convergence; mean square convergence; line search; optimization methods; sample size; convergence; algorithm
DOI: 10.1093/imanum/drz027
Chinese Library Classification (CLC): O29 [Applied Mathematics]
Subject classification code: 070104
Abstract
This paper deals with the minimization of a large sum of convex functions by inexact Newton (IN) methods employing subsampled functions, gradients and Hessian approximations. The conjugate gradient method is used to compute the IN step, and global convergence is enforced by a nonmonotone line-search procedure. The aim is to obtain methods with affordable cost and fast convergence. Assuming strongly convex functions, R-linear convergence and the worst-case iteration complexity of the procedure are investigated when functions and gradients are approximated with increasing accuracy. A set of rules for the forcing terms and Hessian subsample sizes is derived that ensures local q-linear/q-superlinear convergence of the proposed method. The random choice of the Hessian subsample is also considered, and convergence in the mean square, both for finite and infinite sums of functions, is proved. Finally, the analysis of global convergence with an asymptotic R-linear rate is extended to the case of a sum of convex functions with a strongly convex objective function. Numerical results on well-known binary classification problems are also given. Adaptive strategies for selecting the forcing terms and the Hessian subsample size, stemming from the theoretical analysis, are employed, and the numerical results show that they yield effective IN methods.
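For illustration, the sketch below shows the basic shape of one subsampled inexact Newton iteration on an L2-regularized logistic regression problem, a strongly convex finite sum of the kind used in the paper's binary classification experiments. It is a minimal reading of the abstract, not the authors' algorithm: it uses full gradients, a plain monotone Armijo backtracking search in place of the nonmonotone line search, and purely illustrative schedules for the forcing term and the Hessian subsample size; all names and constants here are assumptions.

```python
import numpy as np

def subsampled_inexact_newton(A, b, lam=1e-3, max_iter=50, tol=1e-8, seed=0):
    """Illustrative subsampled IN loop for
    f(w) = (1/N) * sum_i log(1 + exp(-b_i * a_i^T w)) + (lam/2) * ||w||^2,
    with labels b_i in {-1, +1}. A sketch, not the paper's method."""
    rng = np.random.default_rng(seed)
    N, n = A.shape
    w = np.zeros(n)
    eta = 0.5            # forcing term eta_k (illustrative schedule)
    S = max(N // 10, 1)  # Hessian subsample size (illustrative schedule)

    def f(w):
        return np.mean(np.logaddexp(0.0, -b * (A @ w))) + 0.5 * lam * (w @ w)

    def grad(w):
        s = 1.0 / (1.0 + np.exp(b * (A @ w)))  # sigma(-b_i * a_i^T w)
        return -(A.T @ (b * s)) / N + lam * w

    for _ in range(max_iter):
        g = grad(w)
        gnorm = np.linalg.norm(g)
        if gnorm < tol:
            break

        # Subsampled Hessian: only products with
        # H_S = (1/S) sum_{i in S} p_i (1 - p_i) a_i a_i^T + lam * I are needed.
        idx = rng.choice(N, size=S, replace=False)
        As, bs = A[idx], b[idx]
        p = 1.0 / (1.0 + np.exp(-bs * (As @ w)))
        d = p * (1.0 - p)
        Hv = lambda v: As.T @ (d * (As @ v)) / S + lam * v

        # CG on H_S s = -g, stopped at the IN condition ||H_S s + g|| <= eta * ||g||.
        s, r = np.zeros(n), -g.copy()
        q, rs = r.copy(), r @ r
        while np.sqrt(rs) > eta * gnorm:
            Hq = Hv(q)
            alpha = rs / (q @ Hq)
            s += alpha * q
            r -= alpha * Hq
            rs, rs_old = r @ r, rs
            q = r + (rs / rs_old) * q

        # Monotone Armijo backtracking (a simplification of the nonmonotone search).
        t, fw, slope = 1.0, f(w), g @ s
        while f(w + t * s) > fw + 1e-4 * t * slope:
            t *= 0.5
        w = w + t * s

        eta = max(0.5 * eta, 1e-4)  # tighten the forcing term ...
        S = min(2 * S, N)           # ... and enlarge the Hessian subsample
    return w
```

Note that CG only requires Hessian-vector products, so the subsampled Hessian is never formed explicitly; driving the forcing term toward zero while enlarging the Hessian subsample is what the paper's adaptive rules are designed to do in order to reach the local q-superlinear rate.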
Pages: 2309-2341
Number of pages: 33
Related papers (50 records in total)
  • [1] Exact and inexact subsampled Newton methods for optimization
    Bollapragada, Raghu
    Byrd, Richard H.
    Nocedal, Jorge
    IMA Journal of Numerical Analysis, 2019, 39 (02): 545-578
  • [2] Accelerated regularized Newton methods for minimizing composite convex functions
    Grapiglia, Geovani N.
    Nesterov, Yurii
    SIAM Journal on Optimization, 2019, 29 (01): 77-99
  • [3] Algorithms for Finding Copulas Minimizing Convex Functions of Sums
    Bernard, Carole
    McLeish, Don
    Asia-Pacific Journal of Operational Research, 2016, 33 (05)
  • [4] Inexact Proximal Methods for Weakly Convex Functions
    Khanh, Pham Duy
    Mordukhovich, Boris S.
    Phat, Vo Thanh
    Tran, Dat Ba
    arXiv, 2023
  • [5] Inexact proximal methods for weakly convex functions
    Khanh, Pham Duy
    Mordukhovich, Boris S.
    Phat, Vo Thanh
    Tran, Dat Ba
    Journal of Global Optimization, 2025, 91 (03): 611-646
  • [6] Inexact proximal memoryless quasi-Newton methods based on the Broyden family for minimizing composite functions
    Nakayama, Shummin
    Narushima, Yasushi
    Yabe, Hiroshi
    Computational Optimization and Applications, 2021, 79 (01): 127-154
  • [7] Minimizing Uniformly Convex Functions by Cubic Regularization of Newton Method
    Doikov, Nikita
    Nesterov, Yurii
    Journal of Optimization Theory and Applications, 2021, 189: 317-339
  • [8] Inexact Newton methods
    Dembo, R. S.
    Eisenstat, S. C.
    Steihaug, T.
    SIAM Journal on Numerical Analysis, 1982, 19 (02): 400-408