Communication Efficient Distributed Newton Method with Fast Convergence Rates

Cited: 1
Authors
Liu, Chengchang [1 ]
Chen, Lesi [2 ]
Luo, Luo [2 ]
Lui, John C. S. [1 ]
Affiliations
[1] Chinese Univ Hong Kong, Dept Comp Sci & Engn, Hong Kong, Peoples R China
[2] Fudan Univ, Sch Data Sci, Shanghai, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Distributed Optimization; Second-Order Methods; Cubic Regularization
DOI
10.1145/3580305.3599280
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Subject Classification Code
0812
Abstract
We propose a communication- and computation-efficient second-order method for distributed optimization. Each iteration requires only O(d) communication, where d is the problem dimension. We also provide a theoretical analysis showing that the proposed method attains convergence rates similar to those of classical second-order optimization algorithms. Concretely, our method finds (ε, √(dLε))-second-order stationary points of nonconvex problems within O(√(dL) ε^(-3/2)) iterations, where L is the Lipschitz constant of the Hessian. Moreover, it enjoys local superlinear convergence under the strong convexity assumption. Experiments on both convex and nonconvex problems show that the proposed method performs significantly better than the baselines.
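
The abstract leaves the communication protocol implicit, but the O(d) budget rules out shipping the full d-by-d Hessian each round. The following is a minimal Python sketch, assuming (hypothetically) that every worker sends the server only two O(d) vectors per round, namely its local gradient and one Hessian-vector product along the current search direction, while the server maintains a curvature estimate through symmetric rank-1 corrections. It illustrates the communication pattern only, not the paper's actual algorithm; the ridge least-squares problem and the local_grad/local_hvp helpers are invented for the example.

    # Hypothetical O(d)-per-round distributed Newton-type loop (not the
    # authors' protocol): workers send a gradient and one Hessian-vector
    # product per round; the server never sees a full d x d Hessian.
    import numpy as np

    rng = np.random.default_rng(0)
    m, n, d = 4, 32, 10     # workers, samples per worker, dimension
    lam = 1e-2              # ridge regularization

    # Worker i holds (A_i, b_i); f_i(x) = ||A_i x - b_i||^2/(2n) + (lam/2)||x||^2.
    A = [rng.standard_normal((n, d)) for _ in range(m)]
    b = [rng.standard_normal(n) for _ in range(m)]

    def local_grad(i, x):
        # O(d) message: the local gradient.
        return A[i].T @ (A[i] @ x - b[i]) / n + lam * x

    def local_hvp(i, v):
        # O(d) message: Hessian-vector product, computed without ever
        # materializing the d x d local Hessian.
        return A[i].T @ (A[i] @ v) / n + lam * v

    x = np.zeros(d)
    H = np.eye(d)           # server-side curvature estimate

    for t in range(30):
        g = np.mean([local_grad(i, x) for i in range(m)], axis=0)
        p = np.linalg.solve(H, -g)                               # Newton-type direction
        Hp = np.mean([local_hvp(i, p) for i in range(m)], axis=0)
        r = Hp - H @ p
        if abs(r @ p) > 1e-12:
            H += np.outer(r, r) / (r @ p)   # SR1: match true curvature along p
        x = x + p

    g = np.mean([local_grad(i, x) for i in range(m)], axis=0)
    print("final gradient norm:", np.linalg.norm(g))

Each round moves 2d floats per worker instead of the d^2 a full Hessian exchange would require, which is the gap the abstract's O(d) bound refers to.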
Pages: 1406-1416
Page count: 11
Related Papers
50 records in total; entries [21]-[30] shown below
  • [21] Convergence rates of a regularized Newton method in sound-hard inverse scattering
    Hohage, T.
    SIAM JOURNAL ON NUMERICAL ANALYSIS, 1998, 36 (01) : 125 - 142
  • [22] Convergence rates for the iteratively regularized Gauss-Newton method in Banach spaces
    Kaltenbacher, Barbara
    Hofmann, Bernd
    INVERSE PROBLEMS, 2010, 26 (03)
  • [23] A convergence theorem for the Newton method
    Yakovlev, M. N.
    Journal of Mathematical Sciences, 2000, 101 (4) : 3372 - 3375
  • [24] Convergence on a deformed Newton method
    Han, D. F.
    Wang, X. H.
    APPLIED MATHEMATICS AND COMPUTATION, 1998, 94 (01) : 65 - 72
  • [25] On the monotone convergence of Newton's method
    Potra, F. A.
    Rheinboldt, W. C.
    COMPUTING, 1986, 36 (1-2) : 81 - 90
  • [26] Accelerated convergence in Newton's method
    Gerlach, J.
    SIAM REVIEW, 1994, 36 (02) : 272 - 276
  • [27] Distributed Newton Optimization With Maximized Convergence Rate
    Marelli, Damian Edgardo
    Xu, Yong
    Fu, Minyue
    Huang, Zenghong
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (10) : 5555 - 5562
  • [28] A Convergence Analysis of Distributed SGD with Communication-Efficient Gradient Sparsification
    Shi, Shaohuai
    Zhao, Kaiyong
    Wang, Qiang
    Tang, Zhenheng
    Chu, Xiaowen
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 3411 - 3417
  • [29] Innovation Compression for Communication-Efficient Distributed Optimization With Linear Convergence
    Zhang, Jiaqi
    You, Keyou
    Xie, Lihua
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2023, 68 (11) : 6899 - 6906
  • [30] GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning
    Elgabli, Anis
    Park, Jihong
    Bedi, Amrit S.
    Bennis, Mehdi
    Aggarwal, Vaneet
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21