Online Learning Guided Curvature Approximation: A Quasi-Newton Method with Global Non-Asymptotic Superlinear Convergence

Cited by: 0
Authors
Jiang, Ruichen [1 ]
Jin, Qiujiang [1 ]
Mokhtari, Aryan [1 ]
Affiliation
[1] Univ Texas Austin, Austin, TX 78712 USA
Keywords
Quasi-Newton methods; non-asymptotic superlinear convergence rate; online learning; extragradient;
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Quasi-Newton algorithms are among the most popular iterative methods for solving unconstrained minimization problems, largely due to their favorable superlinear convergence property. However, existing results for these algorithms are limited as they provide either (i) a global convergence guarantee with an asymptotic superlinear convergence rate, or (ii) a local non-asymptotic superlinear rate when the initial point and initial Hessian approximation are chosen properly. In particular, no current analysis for quasi-Newton methods guarantees global convergence with an explicit superlinear convergence rate. In this paper, we close this gap and present the first globally convergent quasi-Newton method with an explicit non-asymptotic superlinear convergence rate. Unlike classical quasi-Newton methods, we build our algorithm upon the hybrid proximal extragradient method and propose a novel online learning framework for updating the Hessian approximation matrices. Specifically, guided by the convergence analysis, we formulate the Hessian approximation update as an online convex optimization problem in the space of matrices, and we relate the bounded regret of the online problem to the superlinear convergence of our method.
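The abstract's central idea — treating the Hessian approximation as the decision variable of an online convex optimization problem, where low regret on a curvature-prediction loss drives fast convergence — can be illustrated with a minimal one-dimensional sketch. This is not the authors' algorithm (which builds on the hybrid proximal extragradient method and operates on matrices); the toy quadratic objective, the secant-error loss, and the step size here are all assumptions made for illustration.

```python
# Toy 1-D sketch (illustrative only, not the paper's exact method):
# after each quasi-Newton step, the scalar curvature estimate B is
# updated by an online gradient step on the secant loss
#   ell(B) = (B*s - y)**2,
# where s is the displacement and y the gradient change. If the online
# learner keeps this loss small, B tracks the true curvature and the
# quasi-Newton steps approach Newton steps.

def quasi_newton_online(x0, iters=50, eta=0.1):
    """Minimize f(x) = 1.5*x**2 + x, so f'(x) = 3*x + 1 and x* = -1/3."""
    grad = lambda x: 3.0 * x + 1.0
    x, B = x0, 1.0                        # iterate and curvature estimate
    for _ in range(iters):
        g = grad(x)
        x_new = x - g / B                 # quasi-Newton step with current B
        s = x_new - x                     # displacement
        y = grad(x_new) - g               # gradient change (here y = 3*s exactly)
        # online gradient step on ell(B) = (B*s - y)**2:
        B -= eta * 2.0 * (B * s - y) * s
        x = x_new
    return x, B

x_star, B_final = quasi_newton_online(1.0)
# x_star approaches the minimizer -1/3; B_final drifts toward the true
# curvature f''(x) = 3 as the online learner accumulates secant pairs.
```

In the paper's matrix setting, the same mechanism requires projecting the iterates onto a set of well-conditioned positive definite matrices; the scalar version above omits that safeguard for brevity.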
Pages: 33
Related Papers
14 records in total
  • [2] Non-asymptotic superlinear convergence of standard quasi-Newton methods
    Jin, Qiujiang
    Mokhtari, Aryan
    [J]. MATHEMATICAL PROGRAMMING, 2023, 200 (01) : 425 - 473
  • [3] Distributed adaptive greedy quasi-Newton methods with explicit non-asymptotic convergence bounds
    Du, Yubo
    You, Keyou
    [J]. AUTOMATICA, 2024, 165
  • [4] AN INCREMENTAL QUASI-NEWTON METHOD WITH A LOCAL SUPERLINEAR CONVERGENCE RATE
    Mokhtari, Aryan
    Eisen, Mark
    Ribeiro, Alejandro
    [J]. 2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 4039 - 4043
  • [5] Adaptive Greedy Quasi-Newton with Superlinear Rate and Global Convergence Guarantee
    Du, Yubo
    You, Keyou
    [J]. 2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 7606 - 7611
  • [6] IQN: AN INCREMENTAL QUASI-NEWTON METHOD WITH LOCAL SUPERLINEAR CONVERGENCE RATE
    Mokhtari, Aryan
    Eisen, Mark
    Ribeiro, Alejandro
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (02) : 1670 - 1698
  • [7] Online Equivalence Learning Through A Quasi-Newton Method
    Le Capitaine, Hoel
    [J]. 2012 IEEE INTERNATIONAL CONFERENCE ON FUZZY SYSTEMS (FUZZ-IEEE), 2012.
  • [8] DAve-QN: A Distributed Averaged Quasi-Newton Method with Local Superlinear Convergence Rate
    Soori, Saeed
    Mischenko, Konstantin
    Mokhtari, Aryan
    Dehnavi, Maryam Mehri
    Gurbuzbalaban, Mert
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108
  • [10] On the global convergence of an inexact quasi-Newton conditional gradient method for constrained nonlinear systems
    Goncalves, M. L. N.
    Oliveira, F. R.
    [J]. NUMERICAL ALGORITHMS, 2020, 84 (02) : 609 - 631