DAve-QN: A Distributed Averaged Quasi-Newton Method with Local Superlinear Convergence Rate

Cited: 0
Authors
Soori, Saeed [1 ]
Mischenko, Konstantin [2 ]
Mokhtari, Aryan [3 ]
Dehnavi, Maryam Mehri [1 ]
Gurbuzbalaban, Mert [4 ]
Affiliations
[1] Univ Toronto, CS Dept, Toronto, ON, Canada
[2] KAUST Univ, CS Dept, Thuwal, Saudi Arabia
[3] Univ Texas Austin, ECE Dept, Austin, TX 78712 USA
[4] Rutgers State Univ, MSIS Dept, New Brunswick, NJ USA
Funding
Natural Sciences and Engineering Research Council of Canada; National Science Foundation (USA);
Keywords
OPTIMIZATION;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
In this paper, we consider distributed algorithms for solving the empirical risk minimization problem under the master/worker communication model. We develop a distributed asynchronous quasi-Newton algorithm that can achieve superlinear convergence. To our knowledge, this is the first distributed asynchronous algorithm with superlinear convergence guarantees. Our algorithm is communication-efficient in the sense that at every iteration the master node and workers communicate vectors of size O(p), where p is the dimension of the decision variable. The proposed method is based on a distributed asynchronous averaging scheme of decision vectors and gradients, designed to effectively capture the local Hessian information of the objective function. Our convergence theory supports asynchronous computations subject to both bounded delays and unbounded delays with a bounded time-average. Unlike the majority of the asynchronous optimization literature, we do not require a smaller stepsize when delays are large. We provide numerical experiments that match our theoretical results and showcase significant improvements compared to state-of-the-art distributed algorithms.
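The mechanism the abstract alludes to — using differences of decision vectors and gradients to capture local Hessian information — is the classical quasi-Newton secant idea. The following is a minimal, single-machine BFGS sketch of that idea only; it is not the paper's distributed asynchronous DAve-QN scheme, and all names in it are illustrative.

```python
# Minimal single-machine BFGS sketch (illustrative, not the DAve-QN method):
# differences of decision vectors (s) and gradients (y) build a local
# Hessian approximation B via the secant condition B s = y.
import numpy as np

def bfgs_step(x, B, grad):
    """One quasi-Newton step x+ = x - B^{-1} grad(x), then a BFGS update of B."""
    x_new = x - np.linalg.solve(B, grad(x))
    s = x_new - x              # change in the decision vector
    y = grad(x_new) - grad(x)  # change in the gradient
    Bs = B @ s
    # Standard BFGS formula; the updated B satisfies the secant equation B s = y.
    B_new = B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
    return x_new, B_new

def minimize(grad, x0, iters=50, tol=1e-10):
    x, B = x0.astype(float), np.eye(len(x0))
    for _ in range(iters):
        if np.linalg.norm(grad(x)) < tol:
            break
        x, B = bfgs_step(x, B, grad)
    return x

# Strongly convex quadratic f(x) = 0.5 x^T A x, whose gradient is A x and
# whose unique minimizer is x = 0.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x
x_star = minimize(grad, np.array([1.0, -1.0]))
print(np.linalg.norm(x_star))  # near zero: the iterates reach the minimizer
```

In DAve-QN, per the abstract, analogous vector differences are averaged asynchronously across workers so that only O(p)-sized messages are exchanged per iteration.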
Pages: 11