GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning

Cited by: 0
Authors:
Elgabli, Anis
Park, Jihong
Bedi, Amrit S.
Bennis, Mehdi
Aggarwal, Vaneet
Institutions:
Keywords: OPTIMIZATION; CONSENSUS; CONVERGENCE; ALGORITHM; ADMM
DOI: not available
CLC number: TP [Automation and Computer Technology]
Subject classification code: 0812
Abstract
When data is distributed across multiple servers, lowering the communication cost between the servers (or workers) while solving the distributed learning problem is an important challenge and is the focus of this paper. In particular, we propose a fast and communication-efficient decentralized framework to solve the distributed machine learning (DML) problem. The proposed algorithm, Group Alternating Direction Method of Multipliers (GADMM), is based on the Alternating Direction Method of Multipliers (ADMM) framework. The key novelty in GADMM is that it solves the problem over a decentralized topology in which at most half of the workers compete for the limited communication resources at any given time. Moreover, each worker exchanges its locally trained model only with its two neighboring workers, thereby training a global model with less communication overhead per exchange. We prove that GADMM converges to the optimal solution for convex loss functions, and we show numerically that it converges faster and is more communication-efficient than state-of-the-art communication-efficient algorithms, such as the Lazily Aggregated Gradient (LAG) and dual averaging, on linear and logistic regression tasks with synthetic and real datasets. Furthermore, we propose Dynamic GADMM (D-GADMM), a variant of GADMM, and prove its convergence under a time-varying network topology of the workers.
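In GADMM, the workers form a chain and are split into two alternately updating groups (heads and tails), which is why at most half of them transmit at any given time: heads update their models using the latest models of their two tail neighbors, tails then update using the fresh head models, and finally the dual variable on each chain link is updated. The following single-process NumPy sketch simulates that schedule under the assumption of a quadratic local loss f_n(x) = 0.5*||A_n x - b_n||^2; the function gadmm_least_squares, its variable names, and the closed-form local solve are illustrative assumptions, not the authors' code.

    import numpy as np

    def gadmm_least_squares(A, b, rho=1.0, iters=200):
        # A[n]: (m_n, d) data matrix and b[n]: (m_n,) targets of worker n,
        # whose local loss is f_n(x) = 0.5 * ||A[n] @ x - b[n]||^2.
        N, d = len(A), A[0].shape[1]
        theta = [np.zeros(d) for _ in range(N)]    # local models
        lam = [np.zeros(d) for _ in range(N - 1)]  # dual on each link (n, n+1)

        def local_update(n):
            # Closed-form stationarity of the augmented Lagrangian in theta[n];
            # each worker couples only with its (at most two) chain neighbors.
            rhs = A[n].T @ b[n]
            k = 0  # number of neighbors
            if n > 0:                              # left link (n-1, n)
                rhs = rhs + lam[n - 1] + rho * theta[n - 1]
                k += 1
            if n < N - 1:                          # right link (n, n+1)
                rhs = rhs - lam[n] + rho * theta[n + 1]
                k += 1
            theta[n] = np.linalg.solve(A[n].T @ A[n] + k * rho * np.eye(d), rhs)

        for _ in range(iters):
            for n in range(0, N, 2):               # head group (parallel in a real deployment)
                local_update(n)
            for n in range(1, N, 2):               # tail group, using fresh head models
                local_update(n)
            for n in range(N - 1):                 # dual ascent on each link
                lam[n] += rho * (theta[n] - theta[n + 1])
        return theta

    # Toy run: 6 workers, each holding 20 noisy samples of one linear model.
    rng = np.random.default_rng(0)
    w = rng.normal(size=5)
    A = [rng.normal(size=(20, 5)) for _ in range(6)]
    b = [An @ w + 0.01 * rng.normal(size=20) for An in A]
    models = gadmm_least_squares(A, b)
    print(max(np.linalg.norm(t - w) for t in models))  # all local models near w

In an actual deployment each worker would run its own update and exchange its model only with its two chain neighbors; the sequential loops above merely emulate the parallel within-group updates.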
Pages: 39
Related papers (50 in total):
  • [1] GADMM: Fast and Communication Efficient Framework for Distributed Machine Learning
    Elgabli, Anis
    Park, Jihong
    Bedi, Amrit S.
    Bennis, Mehdi
    Aggarwal, Vaneet
    Journal of Machine Learning Research, 2020, 21
  • [2] Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning
    Elgabli, Anis
    Park, Jihong
    Bedi, Amrit Singh
    Ben Issaid, Chaouki
    Bennis, Mehdi
    Aggarwal, Vaneet
    IEEE Transactions on Communications, 2021, 69(1): 164-181
  • [3] UbiNN: A Communication Efficient Framework for Distributed Machine Learning in Edge Computing
    Li, Ke
    Chen, Kexun
    Luo, Shouxi
    Zhang, Honghao
    Fan, Pingzhi
    IEEE Transactions on Network Science and Engineering, 2023, 10(6): 3368-3383
  • [4] SNAP: A Communication Efficient Distributed Machine Learning Framework for Edge Computing
    Zhao, Yangming
    Fan, Jingyuan
    Su, Lu
    Song, Tongyu
    Wang, Sheng
    Qiao, Chunming
    2020 IEEE 40th International Conference on Distributed Computing Systems (ICDCS), 2020: 584-594
  • [5] Q-GADMM: Quantized Group ADMM for Communication Efficient Decentralized Machine Learning
    Elgabli, Anis
    Park, Jihong
    Bedi, Amrit S.
    Bennis, Mehdi
    Aggarwal, Vaneet
    2020 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2020: 8876-8880
  • [6] Communication Efficient Framework for Decentralized Machine Learning
    Elgabli, Anis
    Park, Jihong
    Bedi, Amrit S.
    Bennis, Mehdi
    Aggarwal, Vaneet
    2020 54th Annual Conference on Information Sciences and Systems (CISS), 2020: 47-51
  • [7] GRACE: A Compressed Communication Framework for Distributed Machine Learning
    Xu, Hang
    Ho, Chen-Yu
    Abdelmoniem, Ahmed M.
    Dutta, Aritra
    Bergou, El Houcine
    Karatsenidis, Konstantinos
    Canini, Marco
    Kalnis, Panos
    2021 IEEE 41st International Conference on Distributed Computing Systems (ICDCS 2021), 2021: 561-572
  • [8] Communication Efficient Distributed Machine Learning with the Parameter Server
    Li, Mu
    Andersen, David G.
    Smola, Alexander
    Yu, Kai
    Advances in Neural Information Processing Systems 27 (NIPS 2014), 2014, 27
  • [9] A communication efficient distributed learning framework for smart environments
    Valerio, Lorenzo
    Passarella, Andrea
    Conti, Marco
    Pervasive and Mobile Computing, 2017, 41: 46-68
  • [10] A Flexible Framework for Communication-Efficient Machine Learning
    Khirirat, Sarit
    Magnusson, Sindri
    Aytekin, Arda
    Johansson, Mikael
    Thirty-Fifth AAAI Conference on Artificial Intelligence, Thirty-Third Conference on Innovative Applications of Artificial Intelligence and the Eleventh Symposium on Educational Advances in Artificial Intelligence, 2021, 35: 8101-8109