Communication-Efficient Distributed Learning via Lazily Aggregated Quantized Gradients

Cited by: 0
Authors
Sun, Jun [1 ]
Chen, Tianyi [2 ]
Giannakis, Georgios B. [3 ]
Yang, Zaiyue [4 ]
Affiliations
[1] Zhejiang Univ, Hangzhou 310027, Peoples R China
[2] Rensselaer Polytech Inst, Troy, NY 12180 USA
[3] Univ Minnesota, Minneapolis, MN 55455 USA
[4] Southern Univ Sci & Technol, Shenzhen 518055, Peoples R China
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The present paper develops a novel aggregated gradient approach for distributed machine learning that adaptively compresses the gradient communication. The key idea is to first quantize the computed gradients, and then skip less informative quantized gradient communications by reusing outdated gradients. Quantizing and skipping result in 'lazy' worker-server communications, which justifies the term Lazily Aggregated Quantized gradient, henceforth abbreviated as LAQ. LAQ provably attains the same linear convergence rate as gradient descent in the strongly convex case, while effecting major savings in communication overhead, both in transmitted bits and in communication rounds. Empirically, experiments with real data corroborate a significant communication reduction compared to existing gradient- and stochastic gradient-based algorithms.
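The quantize-then-skip idea in the abstract can be sketched in code. The following is a minimal illustrative sketch, not the paper's exact algorithm: the uniform quantizer, the function names (`quantize`, `laq_step`), and the fixed skip threshold are assumptions for illustration only (the paper's actual lazy rule adaptively compares the quantized-gradient innovation against weighted past parameter changes).

```python
import numpy as np

def quantize(grad, prev_q, bits=4):
    """Uniformly quantize the gradient innovation grad - prev_q
    (illustrative uniform quantizer, not the paper's exact scheme)."""
    diff = grad - prev_q
    r = np.max(np.abs(diff))
    if r == 0:
        return prev_q.copy()
    levels = 2 ** bits - 1
    step = 2 * r / levels
    # round each coordinate of the innovation to the nearest quantization level
    q = np.round((diff + r) / step) * step - r
    return prev_q + q

def laq_step(workers_grads, prev_quantized, theta, lr=0.1, threshold=1e-3):
    """One LAQ round (sketch): each worker quantizes its gradient and
    uploads only if the innovation is large enough; otherwise the server
    reuses that worker's stale quantized gradient."""
    agg = np.zeros_like(theta)
    for m, g in enumerate(workers_grads):
        q = quantize(g, prev_quantized[m])
        # simplified lazy rule: communicate only when the innovation is large
        if np.linalg.norm(q - prev_quantized[m]) >= threshold:
            prev_quantized[m] = q          # worker uploads fresh quantized gradient
        agg += prev_quantized[m]           # server aggregates (fresh or stale)
    return theta - lr * agg
```

Workers that skip a round cost zero upload bits, and workers that do upload send only the quantized innovation; this is where the dual savings in bits and rounds come from.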
Pages: 11
Related Papers (50 total)
  • [1] LAG: Lazily Aggregated Gradient for Communication-Efficient Distributed Learning
    Chen, Tianyi
    Giannakis, Georgios B.
    Sun, Tao
    Yin, Wotao
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [2] Lazily Aggregated Quantized Gradient Innovation for Communication-Efficient Federated Learning
    Sun, Jun
    Chen, Tianyi
    Giannakis, Georgios B.
    Yang, Qinmin
    Yang, Zaiyue
    [J]. IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (04) : 2031 - 2044
  • [3] SIGNGD with Error Feedback Meets Lazily Aggregated Technique: Communication-Efficient Algorithms for Distributed Learning
    Deng, Xiaoge
    Sun, Tao
    Liu, Feng
    Li, Dongsheng
    [J]. TSINGHUA SCIENCE AND TECHNOLOGY, 2022, 27 (01) : 174 - 185
  • [4] LAGC: Lazily Aggregated Gradient Coding for Straggler-Tolerant and Communication-Efficient Distributed Learning
    Zhang, Jingjing
    Simeone, Osvaldo
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (03) : 962 - 974
  • [6] Communication-efficient Federated Learning via Quantized Clipped SGD
    Jia, Ninghui
    Qu, Zhihao
    Ye, Baoliu
    [J]. WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT I, 2021, 12937 : 559 - 571
  • [7] Communication-Efficient Distributed Optimization with Quantized Preconditioners
    Alimisis, Foivos
    Davies, Peter
    Alistarh, Dan
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [8] Communication-Efficient Federated Learning via Quantized Compressed Sensing
    Oh, Yongjeong
    Lee, Namyoon
    Jeon, Yo-Seb
    Poor, H. Vincent
    [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (02) : 1087 - 1100
  • [9] GSASG: Global Sparsification With Adaptive Aggregated Stochastic Gradients for Communication-Efficient Federated Learning
    Du, Runmeng
    He, Daojing
    Ding, Zikang
    Wang, Miao
    Chan, Sammy
    Li, Xuru
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2024, 11 (17) : 28253 - 28266
  • [10] Communication-Efficient Distributed Learning: An Overview
    Cao, Xuanyu
    Basar, Tamer
    Diggavi, Suhas
    Eldar, Yonina C.
    Letaief, Khaled B.
    Poor, H. Vincent
    Zhang, Junshan
    [J]. IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 851 - 873