Distributed Newton Method for Large-Scale Consensus Optimization

Cited by: 28
Authors
Tutunov, Rasul [1]
Bou-Ammar, Haitham [1]
Jadbabaie, Ali [2]
Affiliations
[1] Huawei Technologies Research & Development, Shenzhen, People's Republic of China
[2] MIT, Institute for Data, Systems, and Society, 77 Massachusetts Ave, Cambridge, MA 02139 USA
Keywords
Convergence; distributed algorithms; subgradient methods; algorithm; descent
DOI
10.1109/TAC.2019.2907711
Chinese Library Classification
TP [automation technology; computer technology]
Subject Classification Code
0812
Abstract
In this paper, we propose a distributed Newton method for the decentralized optimization of large sums of convex functions. The method builds on the Global Consensus decomposition, which creates a set of separable finite-sum minimization problems distributed across the nodes of a graph and coupled through a consensus constraint on the separated variables. The key idea is to exploit the sparsity of the dual Hessian and recast the computation of the Newton step as one of efficiently solving symmetric diagonally dominant (SDD) linear equations. We validate the algorithm both theoretically and empirically. On the theory side, we show that it exhibits superlinear convergence within a neighborhood of the optimum. Empirically, we show that it outperforms state-of-the-art algorithms, including ADMM, on a variety of large-scale optimization problems. The proposed approach scales to large problems and has low communication overhead.
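The core computational step described above, obtaining a Newton direction by solving a symmetric diagonally dominant (SDD) linear system that couples local curvature with the graph structure, can be sketched on a toy problem. The sketch below is not the paper's dual formulation: it assumes a Laplacian-penalized consensus objective with quadratic local terms f_i(x_i) = 0.5*(x_i - a_i)^2 on a hypothetical ring graph, and it uses SciPy's conjugate-gradient routine as a stand-in for the fast SDD solvers the method relies on.

```python
# Minimal illustrative sketch (not the authors' algorithm): a Newton step for a
# Laplacian-penalized consensus problem reduces to solving a symmetric
# diagonally dominant (SDD) linear system. The ring graph, penalty weight, and
# quadratic local objectives are illustrative assumptions.
import numpy as np
from scipy.sparse import csr_matrix, diags
from scipy.sparse.linalg import cg

rng = np.random.default_rng(0)
n = 50                                   # number of nodes (assumed ring topology)
a = rng.normal(size=n)                   # local data: f_i(x_i) = 0.5 * (x_i - a_i)**2
rho = 1.0                                # consensus-penalty weight (assumed)

# Ring-graph Laplacian L = D - A, which is SDD by construction.
idx = np.arange(n)
A = csr_matrix((np.ones(n), (idx, (idx + 1) % n)), shape=(n, n))
A = A + A.T
L = diags(np.asarray(A.sum(axis=1)).ravel()) - A

# Penalized objective: F(x) = sum_i 0.5*(x_i - a_i)^2 + (rho/2) * x^T L x.
x = np.zeros(n)
for _ in range(3):
    grad = (x - a) + rho * (L @ x)       # gradient of F
    H = diags(np.ones(n)) + rho * L      # Hessian of F: diagonal + Laplacian -> SDD
    d, _ = cg(H, -grad)                  # Newton direction via an SDD solve
    x = x + d                            # unit step; F is quadratic, so this is near-exact

# Sanity check against a direct dense solve of the same optimality system.
x_direct = np.linalg.solve(np.eye(n) + rho * L.toarray(), a)
print("max deviation from direct solve:", np.abs(x - x_direct).max())
print("gradient norm:", np.linalg.norm((x - a) + rho * (L @ x)))
```

In the paper the SDD systems arise from the sparsity pattern of the dual Hessian rather than from a penalty term, and they are solved with specialized nearly-linear-time SDD solvers; conjugate gradients here is only a placeholder for that step.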
Pages: 3983-3994
Number of pages: 12
Related papers
50 records in total
  • [1] A Newton Based Distributed Optimization Method with Local Interactions for Large-Scale Networked Optimization Problems
    HomChaudhuri, Baisravan; Kumar, Manish
    2014 American Control Conference (ACC), 2014: 4336-4341
  • [2] A Newton consensus method for distributed optimization
    Guay, Martin
    IFAC-PapersOnLine, 2020, 53 (02): 5417-5422
  • [3] Towards a discrete Newton method with memory for large-scale optimization
    Byrd, R. H.; Nocedal, J.; Zhu, C. Y.
    Nonlinear Optimization and Applications, 1996: 1-12
  • [4] A Stochastic Quasi-Newton Method for Large-Scale Optimization
    Byrd, R. H.; Hansen, S. L.; Nocedal, Jorge; Singer, Y.
    SIAM Journal on Optimization, 2016, 26 (02): 1008-1031
  • [5] A Distributed Newton Method for Processing Signals Defined on the Large-Scale Networks
    Zhang, Yanhai; Jiang, Junzheng; Wang, Haitao; Ma, Mou
    China Communications, 2023, 20 (05): 315-329
  • [6] A Communication Efficient Quasi-Newton Method for Large-Scale Distributed Multi-Agent Optimization
    Li, Yichuan; Voulgaris, Petros G.; Freris, Nikolaos M.
    2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2022: 4268-4272
  • [7] Multi-Agent Distributed Large-Scale Optimization by Inexact Consensus Alternating Direction Method of Multipliers
    Chang, Tsung-Hui; Hong, Mingyi; Wang, Xiangfeng
    2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2014
  • [8] Large-Scale and Distributed Optimization: Preface
    Giselsson, Pontus; Rantzer, Anders
    Large-Scale and Distributed Optimization, 2018, 2227: V
  • [9] Large-Scale and Distributed Optimization: An Introduction
    Giselsson, Pontus; Rantzer, Anders
    Large-Scale and Distributed Optimization, 2018, 2227: 1-10