Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates

Cited by: 0
Authors
Hiva Ghanbari
Katya Scheinberg
Affiliations
[1] Lehigh University, Department of Industrial and Systems Engineering
Keywords
Convex composite optimization; Strong convexity; Proximal quasi-Newton methods; Accelerated scheme; Convergence rates; Randomized coordinate descent;
DOI: not available
Abstract
A general, inexact, and efficient proximal quasi-Newton algorithm for composite optimization problems was proposed by Scheinberg and Tang (Math Program 160:495–529, 2016), and a sublinear global convergence rate was established for it. In this paper, we analyze the global convergence rate of this method, in both the exact and inexact settings, when the objective function is strongly convex. We also investigate a practical variant of this method by establishing a simple stopping criterion for the subproblem optimization. Furthermore, we apply an accelerated scheme, based on the FISTA of Beck and Teboulle (SIAM J Imaging Sci 2:183–202, 2009), to the proximal quasi-Newton algorithm. Jiang et al. (SIAM J Optim 22:1042–1064, 2012) considered a similar accelerated method, but their convergence rate analysis relies on very strong, impractical assumptions on the Hessian estimates. We present a modified analysis that relaxes these assumptions, and we perform a numerical comparison of the accelerated proximal quasi-Newton algorithm and the regular one. Our analysis and computational results show that acceleration may not bring any benefit in the quasi-Newton setting.
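As a rough illustration of the two schemes the abstract compares (not the authors' algorithm; plain gradient steps stand in for the quasi-Newton Hessian models of the paper), the regular proximal gradient method and its FISTA-accelerated counterpart can be sketched on an ℓ1-regularized least-squares instance:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def prox_grad(A, b, lam, n_iter=300, accelerated=False):
    """Minimize F(x) = 0.5*||Ax - b||^2 + lam*||x||_1.

    accelerated=False: plain proximal gradient steps.
    accelerated=True:  the FISTA momentum scheme of Beck and Teboulle.
    A scalar 1/L step replaces the quasi-Newton Hessian approximation
    used in the paper; this is a simplification, not the paper's method.
    """
    L = np.linalg.norm(A, 2) ** 2        # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        if accelerated:
            t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
            y = x_new + ((t - 1.0) / t_new) * (x_new - x)
            t = t_new
        else:
            y = x_new
        x = x_new
    return x

def objective(A, b, lam, x):
    return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))
```

Running both variants on the same random instance and comparing objective values per iteration is the kind of experiment the numerical comparison in the paper performs, with quasi-Newton models in place of the scalar step size.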
Pages: 597–627 (30 pages)
Related papers (50 total)
  • [1] Proximal quasi-Newton methods for regularized convex optimization with linear and accelerated sublinear convergence rates
    Ghanbari, Hiva
    Scheinberg, Katya
    [J]. COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2018, 69 (03) : 597 - 627
  • [2] Proximal quasi-Newton methods for nondifferentiable convex optimization
    Chen, XJ
    Fukushima, M
    [J]. MATHEMATICAL PROGRAMMING, 1999, 85 (02) : 313 - 334
  • [4] Accelerated Quasi-Newton Proximal Extragradient: Faster Rate for Smooth Convex Optimization
    Jiang, Ruichen
    Mokhtari, Aryan
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36, NEURIPS 2023, 2023,
  • [5] Stochastic proximal quasi-Newton methods for non-convex composite optimization
    Wang, Xiaoyu
    Wang, Xiao
    Yuan, Ya-xiang
    [J]. OPTIMIZATION METHODS & SOFTWARE, 2019, 34 (05): : 922 - 948
  • [6] Rates of superlinear convergence for classical quasi-Newton methods
    Anton Rodomanov
    Yurii Nesterov
    [J]. Mathematical Programming, 2022, 194 : 159 - 190
  • [8] GLOBAL CONVERGENCE OF A CLASS OF QUASI-NEWTON METHODS ON CONVEX PROBLEMS
    BYRD, RH
    NOCEDAL, J
    YUAN, YX
    [J]. SIAM JOURNAL ON NUMERICAL ANALYSIS, 1987, 24 (05) : 1171 - 1190
  • [9] Global convergence of quasi-Newton methods for unconstrained optimization
    Han, Lixing
    Liu, Guanghui
    [J]. Science Bulletin, 1996, (07) : 529 - 533
  • [10] ON THE LOCAL CONVERGENCE OF QUASI-NEWTON METHODS FOR CONSTRAINED OPTIMIZATION
    BOGGS, PT
    TOLLE, JW
    WANG, P
    [J]. SIAM JOURNAL ON CONTROL AND OPTIMIZATION, 1982, 20 (02) : 161 - 171