Explicit Convergence Rates of Greedy and Random Quasi-Newton Methods

Cited by: 0
Authors
Lin, Dachao [1 ]
Ye, Haishan [2 ]
Zhang, Zhihua [3 ]
Affiliations
[1] Peking Univ, Acad Adv Interdisciplinary Studies, Beijing, Peoples R China
[2] Xi An Jiao Tong Univ, Sch Management, Xian, Peoples R China
[3] Peking Univ, Sch Math Sci, Beijing, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
quasi-Newton methods; superlinear convergence; local convergence; rate of convergence; Broyden family; SR1; BFGS; DFP; global convergence
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Optimization is central to machine learning, and quasi-Newton methods are among the most efficient numerical methods for smooth unconstrained optimization. In this paper, we study the explicit superlinear convergence rates of quasi-Newton methods and address two open problems posed by Rodomanov and Nesterov (2021b). First, we extend the results of Rodomanov and Nesterov (2021b) to random quasi-Newton methods, which include the common DFP, BFGS, and SR1 methods; such random methods employ a random direction for updating the approximate Hessian matrix in each iteration. Second, we focus on two specific quasi-Newton methods, SR1 and BFGS, and provide improved versions of their greedy and random variants with provably better explicit (local) superlinear convergence rates. Our analysis covers the approximation of a given Hessian matrix, unconstrained quadratic objectives, and general strongly convex, smooth, and strongly self-concordant functions.
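To make the direction-based Hessian update concrete, below is a minimal NumPy sketch of the SR1 update applied to approximating a fixed positive definite matrix A, with both a greedy and a random direction choice. The function name sr1_update, the toy problem, and the initialization are illustrative assumptions for this sketch, not code from the paper.

```python
import numpy as np

def sr1_update(G, A, u):
    """One SR1 step moving the approximation G toward the target matrix A
    along direction u: G+ = G - r r^T / (u^T r), where r = (G - A) u.
    The step is skipped when the curvature along u is numerically zero,
    since the SR1 update is undefined in that case."""
    r = (G - A) @ u
    denom = u @ r
    if abs(denom) < 1e-12:
        return G
    return G - np.outer(r, r) / denom

# Toy setup (illustrative, not from the paper).
rng = np.random.default_rng(0)
n = 5
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)      # target "Hessian": symmetric positive definite
G = np.trace(A) * np.eye(n)      # initial approximation satisfying G >= A

for k in range(n):
    # Greedy variant: one natural rule picks the coordinate direction with
    # the largest residual u^T (G - A) u.
    u = np.eye(n)[np.argmax(np.diag(G - A))]
    # u = rng.standard_normal(n)  # random variant: draw u from a Gaussian
    G = sr1_update(G, A, u)
    print(k, np.linalg.norm(G - A))  # residual shrinks at every step
```

As a design note, each non-degenerate SR1 step reduces the rank of the residual G - A by one, a classical property of SR1, so for a fixed matrix the sketch recovers A exactly in at most n steps; the paper's rates quantify how fast the residual contracts under the greedy and random direction choices.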
Pages: 40
Related Papers (50 records)
  • [1] Explicit Convergence Rates of Greedy and Random Quasi-Newton Methods
    Lin, Dachao
    Ye, Haishan
    Zhang, Zhihua
Journal of Machine Learning Research, 2022, 23
  • [2] Greedy and Random Quasi-Newton Methods with Faster Explicit Superlinear Convergence
    Lin, Dachao
    Ye, Haishan
    Zhang, Zhihua
Advances in Neural Information Processing Systems 34 (NeurIPS 2021), 2021, 34
  • [3] Greedy Quasi-Newton Methods with Explicit Superlinear Convergence
    Rodomanov, Anton
    Nesterov, Yurii
SIAM Journal on Optimization, 2021, 31(1): 785-811
  • [4] Distributed Adaptive Greedy Quasi-Newton Methods with Explicit Non-Asymptotic Convergence Bounds
    Du, Yubo
    You, Keyou
Automatica, 2024, 165
  • [5] Rates of Superlinear Convergence for Classical Quasi-Newton Methods
    Rodomanov, Anton
    Nesterov, Yurii
    Mathematical Programming, 2022, 194(1-2): 159-190
  • [6] Incremental Quasi-Newton Methods with Faster Superlinear Convergence Rates
    Liu, Zhuanghua
    Luo, Luo
    Low, Bryan Kian Hsiang
Thirty-Eighth AAAI Conference on Artificial Intelligence, 2024, 38(13): 14097-14105
  • [7] On the Convergence of Inexact Quasi-Newton Methods
    Moret, I.
    International Journal of Computer Mathematics, 1989, 28(1-4): 117-137
  • [8] Adaptive Quasi-Newton and Anderson Acceleration Framework with Explicit Global (Accelerated) Convergence Rates
    Scieur, Damien
    International Conference on Artificial Intelligence and Statistics, 2024, 238
  • [9] On the Convergence of Quasi-Newton Methods for Nonsmooth Problems
    Lopes, VLR
    Martinez, JM
    Numerical Functional Analysis and Optimization, 1995, 16(9-10): 1193-1209