Exact and inexact subsampled Newton methods for optimization

Cited by: 74
Authors
Bollapragada, Raghu [1]
Byrd, Richard H. [2]
Nocedal, Jorge [1]
Affiliations
[1] Northwestern Univ, Dept Ind Engn & Management Sci, Evanston, IL 60208 USA
[2] Univ Colorado, Dept Comp Sci, Boulder, CO 80309 USA
Funding
U.S. National Science Foundation;
Keywords
machine learning; subsampling; stochastic optimization;
DOI
10.1093/imanum/dry009
CLC number
O29 [Applied Mathematics];
Discipline code
070104;
Abstract
The paper studies the solution of stochastic optimization problems in which approximations to the gradient and Hessian are obtained through subsampling. We first consider Newton-like methods that employ these approximations and discuss how to coordinate the accuracy in the gradient and Hessian to yield a superlinear rate of convergence in expectation. The second part of the paper analyzes an inexact Newton method that solves linear systems approximately using the conjugate gradient (CG) method, and that samples the Hessian and not the gradient (the gradient is assumed to be exact). We provide a complexity analysis for this method based on the properties of the CG iteration and the quality of the Hessian approximation, and compare it with a method that employs a stochastic gradient iteration instead of the CG method. We report preliminary numerical results that illustrate the performance of inexact subsampled Newton methods on machine learning applications based on logistic regression.
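The inexact variant described in the second part of the abstract — exact gradient, a Hessian estimated from a random subsample and accessed only through Hessian-vector products, and a CG solve truncated early — can be sketched as below. This is a minimal illustration under assumed synthetic data, sample sizes, tolerances, and a small ridge term added for numerical stability; it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic logistic-regression data (hypothetical, for illustration only).
n, d = 1000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)
lam = 1e-3  # assumed small ridge term to keep the Hessian positive definite

def full_gradient(w):
    """Exact gradient of the regularized average logistic loss."""
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return X.T @ (p - y) / n + lam * w

def subsampled_hessian_matvec(w, idx):
    """Return v -> H_S v, the Hessian-vector product on subsample idx."""
    Xs = X[idx]
    p = 1.0 / (1.0 + np.exp(-Xs @ w))
    D = p * (1.0 - p)  # per-sample curvature weights
    return lambda v: Xs.T @ (D * (Xs @ v)) / len(idx) + lam * v

def cg_solve(matvec, b, rtol=1e-2, maxiter=20):
    """Plain conjugate gradient, stopped early (the 'inexact' part)."""
    x = np.zeros_like(b)
    r = b.copy()          # residual b - A x with x = 0
    p = r.copy()
    rs = r @ r
    bnorm = np.linalg.norm(b)
    for _ in range(maxiter):
        Ap = matvec(p)
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= rtol * bnorm:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def inexact_subsampled_newton(w, iters=10, sample_size=200):
    """Newton iteration: exact gradient, subsampled Hessian, truncated CG."""
    for _ in range(iters):
        g = full_gradient(w)
        idx = rng.choice(n, size=sample_size, replace=False)
        Hv = subsampled_hessian_matvec(w, idx)
        step = cg_solve(Hv, -g)  # approximately solve H_S step = -g
        w = w + step
    return w

w = inexact_subsampled_newton(np.zeros(d))
print("final gradient norm:", np.linalg.norm(full_gradient(w)))
```

Because the Hessian appears only through matrix-vector products, each CG iteration costs one pass over the subsample rather than a full Hessian assembly, which is the structural property the paper's complexity analysis exploits.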
Pages: 545-578
Page count: 34