Communication-efficient distributed optimization with adaptability to system heterogeneity

Cited: 0
Authors
Yu, Ziyi [1 ]
Freris, Nikolaos M. [1 ]
Affiliations
[1] University of Science and Technology of China, School of Computer Science, Hefei, People's Republic of China
Keywords
Alternating direction method; Convergence; Algorithm; Clocks
DOI
10.1109/CDC49753.2023.10383755
CLC number
TP [automation and computer technology]
Discipline code
0812
Abstract
We consider the setting of agents cooperatively minimizing the sum of local objectives plus a regularizer on a graph. This paper proposes a primal-dual method that accounts for three distinctive attributes of real-life multi-agent systems, namely: (i) expensive communication, (ii) lack of synchronization, and (iii) system heterogeneity. Specifically, we propose a distributed asynchronous algorithm with minimal communication cost, in which users commit variable amounts of local work on their respective subproblems. We illustrate this both theoretically and experimentally in the machine learning setting, where the agents hold private data and use a stochastic Newton method as the local solver. Under standard assumptions of Lipschitz continuous gradients and strong convexity, our analysis establishes linear convergence in expectation and characterizes the dependence of the rate on the number of local iterations. We go a step further and propose a simple means for tuning agents' hyperparameters locally, so as to adjust to heterogeneity and accelerate overall convergence. Finally, we validate the proposed method on a benchmark machine learning dataset to illustrate its merits in terms of computation, communication, and run-time savings, as well as its adaptability to heterogeneity.
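For reference, the setting described in the abstract is the standard regularized consensus problem: N agents connected by a graph cooperatively solve

    \min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{N} f_i(x) + r(x),

where f_i is the local objective held privately by agent i and r is a shared regularizer. The Python sketch below is a minimal illustration of this class of methods, not the paper's algorithm: an inexact consensus-ADMM loop (the keywords reference the alternating direction method) in which each agent replaces the exact subproblem solve with a variable number of local Newton steps, modeling heterogeneous amounts of local work. All names and parameters (rho, lam, num_steps) are illustrative assumptions.

    # Illustrative sketch only (NOT the authors' algorithm): inexact
    # consensus ADMM where each agent performs a variable number of local
    # Newton steps on its quadratic subproblem.
    import numpy as np

    def local_newton_steps(A, b, x, z, u, rho, num_steps):
        """Approximately minimize 0.5*||A x - b||^2 + (rho/2)*||x - z + u||^2.
        The subproblem is quadratic, so one Newton step solves it exactly;
        extra steps stand in for 'variable local work'."""
        H = A.T @ A + rho * np.eye(x.size)              # local Hessian
        for _ in range(num_steps):
            g = A.T @ (A @ x - b) + rho * (x - z + u)   # local gradient
            x = x - np.linalg.solve(H, g)               # Newton step
        return x

    def inexact_consensus_admm(data, rho=1.0, lam=0.1, rounds=50):
        """Agents hold (A_i, b_i); consensus variable z carries an L1 regularizer."""
        d, n = data[0][0].shape[1], len(data)
        x = [np.zeros(d) for _ in range(n)]
        u = [np.zeros(d) for _ in range(n)]
        z = np.zeros(d)
        for _ in range(rounds):
            # Heterogeneous local work: each agent uses its own step count.
            for i, (A, b) in enumerate(data):
                x[i] = local_newton_steps(A, b, x[i], z, u[i], rho,
                                          num_steps=1 + i % 3)
            # z-update: soft-thresholding, i.e. the prox of lam*||.||_1.
            xbar = sum(x[i] + u[i] for i in range(n)) / n
            z = np.sign(xbar) * np.maximum(np.abs(xbar) - lam / (n * rho), 0.0)
            # Scaled dual update.
            for i in range(n):
                u[i] = u[i] + x[i] - z
        return z

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        data = [(rng.standard_normal((20, 5)), rng.standard_normal(20))
                for _ in range(4)]
        print(inexact_consensus_admm(data))

Note that in this sketch all agents still synchronize at the z-update; per the abstract, the paper's contribution is to remove that synchronization and to characterize how the number of local iterations affects the linear convergence rate.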
Pages: 3321-3326
Page count: 6
Related papers
50 records in total
  • [1] Huang, Long-Kai; Pan, Sinno Jialin. Communication-Efficient Distributed PCA by Riemannian Optimization. International Conference on Machine Learning, Vol. 119, 2020.
  • [2] Huang, Longbo. Double Quantization for Communication-Efficient Distributed Optimization. Proceedings of the 13th EAI International Conference on Performance Evaluation Methodologies and Tools (VALUETOOLS 2020), 2020: 2-2.
  • [3] Yu, Yue; Wu, Jiaxiang; Huang, Longbo. Double Quantization for Communication-Efficient Distributed Optimization. Advances in Neural Information Processing Systems 32 (NIPS 2019), 2019.
  • [4] Alimisis, Foivos; Davies, Peter; Alistarh, Dan. Communication-Efficient Distributed Optimization with Quantized Preconditioners. International Conference on Machine Learning, Vol. 139, 2021.
  • [5] Wangni, Jianqiao; Wang, Jialei; Liu, Ji; Zhang, Tong. Gradient Sparsification for Communication-Efficient Distributed Optimization. Advances in Neural Information Processing Systems 31 (NIPS 2018), 2018.
  • [6] Cardoso, Diogo; Li, Boyue; Chi, Yuejie; Xavier, Joao. Harvesting Curvatures for Communication-Efficient Distributed Optimization. 2022 56th Asilomar Conference on Signals, Systems, and Computers, 2022: 749-753.
  • [7] Duan, Rui; Ning, Yang; Chen, Yong. Heterogeneity-aware and communication-efficient distributed statistical inference. Biometrika, 2022, 109(1): 67-83.
  • [8] Li, Yu-Sheng; Chiang, Wei-Lin; Lee, Ching-pei. Manifold Identification for Ultimately Communication-Efficient Distributed Optimization. International Conference on Machine Learning, Vol. 119, 2020.
  • [9] Liu, Rui; Mozafari, Barzan. Communication-efficient Distributed Learning for Large Batch Optimization. International Conference on Machine Learning, Vol. 162, 2022.
  • [10] Reisizadeh, Hadi; Touri, Behrouz; Mohajer, Soheil. Adaptive Bit Allocation for Communication-Efficient Distributed Optimization. 2021 60th IEEE Conference on Decision and Control (CDC), 2021: 1994-2001.