Distributed adaptive greedy quasi-Newton methods with explicit non-asymptotic convergence bounds

Cited by: 0
Authors
Du, Yubo [1 ,2 ]
You, Keyou [1 ,2 ]
Affiliations
[1] Tsinghua Univ, Dept Automat, Beijing 100084, Peoples R China
[2] Tsinghua Univ, BNRist, Beijing 100084, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Quasi-Newton methods; Non-asymptotic bounds; Superlinear convergence; Adaptive stepsize; Distributed optimization; DESCENT;
DOI
10.1016/j.automatica.2024.111629
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812
Abstract
Though quasi-Newton methods have been extensively studied in the literature, they either suffer from only local convergence or rely on a series of line searches for global convergence, which are not easy to implement in the distributed setting. In this work, we first propose a line-search-free greedy quasi-Newton (GQN) method with adaptive stepsizes and establish explicit non-asymptotic bounds for both the global convergence rate and the local superlinear rate. Our novel idea lies in the design of multiple GQN updates, which only involve computing Hessian-vector products, to control the Hessian approximation error, together with a simple mechanism that adjusts stepsizes to ensure objective function improvement per iterate. Then, we extend it to the master-worker framework and propose a distributed adaptive GQN method whose communication cost is comparable with that of first-order methods, yet which retains the superb convergence properties of its centralized counterpart. Finally, we demonstrate the advantages of our methods via numerical experiments. (c) 2024 Elsevier Ltd. All rights reserved.
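The key primitive the abstract describes, a greedy quasi-Newton update driven only by Hessian-vector products, can be illustrated with a minimal sketch. This is not the paper's algorithm (which adds adaptive stepsizes and a distributed master-worker scheme); it is a greedy SR1-type update on a toy quadratic, where the greedy rule, the helper name `greedy_sr1_update`, and the initialization `G = ||A|| I` are all illustrative assumptions.

```python
import numpy as np

def greedy_sr1_update(G, hess_vec, dim):
    """One greedy SR1 update: pick the coordinate direction with the
    largest diagonal approximation gap and correct G along it, using
    only Hessian-vector products (never the full Hessian matrix)."""
    # Greedy rule (a sketch): choose e_i maximising G_ii - A_ii, i.e. the
    # diagonal of the residual G - A, obtained via Hessian-vector products.
    residuals = np.array([G[i, i] - hess_vec(np.eye(dim)[i])[i]
                          for i in range(dim)])
    i = int(np.argmax(residuals))
    u = np.eye(dim)[i]
    r = G @ u - hess_vec(u)        # (G - A) u, one Hessian-vector product
    denom = r @ u                  # u^T (G - A) u
    if abs(denom) < 1e-12:         # G already exact along u
        return G
    return G - np.outer(r, r) / denom   # rank-one SR1 correction

# Toy quadratic f(x) = 0.5 x^T A x, so the Hessian-vector product is A @ v.
rng = np.random.default_rng(0)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)        # symmetric positive definite Hessian
hv = lambda v: A @ v

G = np.linalg.norm(A, 2) * np.eye(5)   # overestimate: G - A is PSD
for _ in range(5):                     # n greedy steps recover A exactly
    G = greedy_sr1_update(G, hv, 5)
print(np.linalg.norm(G - A))           # near zero: exact recovery on a quadratic
```

On a quadratic, each SR1 correction zeroes the residual along the chosen direction and (by the hereditary property of SR1) preserves all previously matched directions, so the Hessian approximation error is driven to zero in at most n updates; the paper's methods exploit this kind of controlled error decay to obtain explicit superlinear rates.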
Pages: 13