Consensus-Based Distributed Optimization: Practical Issues and Applications in Large-Scale Machine Learning

Cited by: 0
Authors
Tsianos, Konstantinos I. [1 ]
Lawlor, Sean [1 ]
Rabbat, Michael G. [1 ]
Affiliations
[1] McGill Univ, Dept Elect & Comp Engn, Montreal, PQ, Canada
Keywords
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology];
Subject Classification Code
0812;
Abstract
This paper discusses practical consensus-based distributed optimization algorithms. In consensus-based optimization algorithms, nodes interleave local gradient descent steps with consensus iterations. Gradient steps drive the solution toward a minimizer, while the consensus iterations synchronize the values so that all nodes converge to a network-wide optimum when the objective is convex and separable. The consensus update requires communication. If communication is synchronous and nodes wait to receive one message from each of their neighbors before updating, then progress is limited by the slowest node. To be robust to failing or stalling nodes, asynchronous communication should be used. Asynchronous protocols that rely on bi-directional communication can deadlock, so one-directional protocols are necessary. However, with one-directional asynchronous protocols it is no longer possible to guarantee that the consensus matrix is doubly stochastic. At the same time, it is essential that the coordination protocol achieve consensus on the average to avoid biasing the optimization objective. We report on experiments running Push-Sum Distributed Dual Averaging for convex optimization on an MPI cluster. The experiments illustrate the benefits of asynchronous consensus-based distributed optimization when some nodes are unreliable and may fail, or when messages experience time-varying delays.
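To make the interleaving of gradient and consensus steps concrete, the following is a minimal, synchronous, single-process simulation in the spirit of Push-Sum Distributed Dual Averaging on a toy least-squares problem. It is only a sketch of the idea summarized in the abstract, not the authors' asynchronous MPI implementation: the directed graph, the quadratic local objectives, and the 1/sqrt(t) step-size schedule are illustrative assumptions.

# Minimal single-process simulation of push-sum style distributed dual averaging
# on a toy problem. Illustrative sketch only: the directed graph, local
# objectives, and step-size schedule are assumptions, not the paper's MPI setup.
import numpy as np

rng = np.random.default_rng(0)
n, d, T = 8, 3, 3000                      # nodes, dimension, iterations

# Local objectives f_i(x) = 0.5 * ||A_i x - b_i||^2; the network-wide objective
# is their (separable) sum, minimized by the global least-squares solution.
A = [rng.normal(size=(5, d)) for _ in range(n)]
b = [rng.normal(size=5) for _ in range(n)]
x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]

# Directed ring with self-loops plus one extra edge. Each node splits its
# outgoing mass evenly, so the mixing matrix P is column-stochastic but NOT
# doubly stochastic -- the situation that motivates push-sum.
out_neighbors = {j: [j, (j + 1) % n] for j in range(n)}
out_neighbors[0].append(n // 2)
P = np.zeros((n, n))
for j, outs in out_neighbors.items():
    for i in outs:
        P[i, j] = 1.0 / len(outs)

z = np.zeros((n, d))                      # accumulated gradients (dual averages)
w = np.ones(n)                            # push-sum weights
x = np.zeros((n, d))                      # primal iterates, one row per node

for t in range(1, T + 1):
    g = np.array([A[i].T @ (A[i] @ x[i] - b[i]) for i in range(n)])
    z = P @ z + g                         # consensus step on duals, then add local gradient
    w = P @ w                             # same mixing applied to the push-sum weights
    alpha = 1.0 / np.sqrt(t)              # diminishing step size (assumed schedule)
    # Unconstrained dual-averaging update with psi(x) = 0.5*||x||^2: the ratio
    # z_i / w_i de-biases the non-doubly-stochastic mixing before the prox step.
    x = -alpha * (z / w[:, None])

print("node disagreement:", np.max(np.linalg.norm(x - x.mean(axis=0), axis=1)))
print("distance to global minimizer:", np.linalg.norm(x.mean(axis=0) - x_star))

Because the mixing matrix above is only column-stochastic, mixing z alone would bias the iterates toward nodes with larger in-degree; dividing by the push-sum weight w removes that bias, which is the role the abstract assigns to achieving consensus on the average.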
Pages: 1543-1550
Page count: 8
    [J]. PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021,