Communication-Efficient Distributed SGD With Compressed Sensing

Cited by: 3
Authors
Tang, Yujie [1 ]
Ramanathan, Vikram [1 ]
Zhang, Junshan [2 ]
Li, Na [1 ]
Affiliations
[1] Harvard Univ, Sch Engn & Appl Sci, Allston, MA 02134 USA
[2] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85287 USA
Source
IEEE Control Systems Letters
Funding
National Science Foundation (NSF);
Keywords
Servers; Compressed sensing; Sensors; Stochastic processes; Sparse matrices; Optimization; Convergence; Optimization algorithms; large-scale systems; distributed optimization; compressed sensing; CONVERGENCE;
DOI
10.1109/LCSYS.2021.3137859
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
We consider large-scale distributed optimization over a set of edge devices connected to a central server, where the limited communication bandwidth between the server and the edge devices imposes a significant bottleneck on the optimization procedure. Inspired by recent advances in federated learning, we propose a distributed stochastic gradient descent (SGD) type algorithm that exploits the sparsity of the gradient, when possible, to reduce the communication burden. At the heart of the algorithm is the use of compressed sensing techniques: each device compresses its local stochastic gradient before transmission, and the server recovers a sparse approximation of the global stochastic gradient from the noisy aggregate of the compressed local gradients. We analyze the convergence of the algorithm in the presence of noise perturbations incurred by the communication channels, and present numerical experiments that corroborate its effectiveness.
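To make the compress-aggregate-recover pipeline concrete, below is a minimal NumPy sketch of the kind of scheme the abstract describes. It is an illustration under assumptions the abstract does not fix: a Gaussian measurement matrix `Phi` shared between server and devices, iterative hard thresholding (IHT) as the server-side sparse recovery routine, additive Gaussian channel noise, and a toy sparse-gradient oracle `local_stochastic_gradient` standing in for real device gradients. The paper's actual design choices may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k = 1000, 100, 10        # model dimension, measurements per round, sparsity
n_devices, lr, noise_std = 5, 0.5, 1e-3

# Shared measurement matrix: server and devices agree on Phi (e.g., via a seed).
Phi = rng.standard_normal((m, d)) / np.sqrt(m)

def hard_threshold(v, k):
    """Keep the k largest-magnitude entries of v and zero out the rest."""
    out = np.zeros_like(v)
    top = np.argpartition(np.abs(v), -k)[-k:]
    out[top] = v[top]
    return out

def iht_recover(y, Phi, k, n_iters=100):
    """Iterative hard thresholding: find a k-sparse g with Phi @ g ~= y."""
    step = 1.0 / np.linalg.norm(Phi, ord=2) ** 2   # conservative step size
    g = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        g = hard_threshold(g + step * Phi.T @ (y - Phi @ g), k)
    return g

def local_stochastic_gradient(x):
    """Toy placeholder: gradient of 0.5*||x||^2 restricted to the k
    largest coordinates, plus per-device sampling noise."""
    g = np.zeros_like(x)
    top = np.argpartition(np.abs(x), -k)[-k:]
    g[top] = x[top] + 0.01 * rng.standard_normal(k)
    return g

x = rng.standard_normal(d)
for t in range(200):
    # Device side: compress each local stochastic gradient from d to m numbers.
    compressed = [Phi @ local_stochastic_gradient(x) for _ in range(n_devices)]
    # Uplink: the server receives a noisy aggregate of the compressed gradients.
    y = np.mean(compressed, axis=0) + noise_std * rng.standard_normal(m)
    # Server side: recover a sparse approximation of the global gradient, then step.
    g_hat = iht_recover(y, Phi, k)
    x -= lr * g_hat

print("final ||x||:", np.linalg.norm(x))
```

The communication saving is on the uplink: in this sketch each device transmits m = 100 numbers per round instead of d = 1000, and the server only ever operates on the m-dimensional noisy aggregate.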
Pages: 2054 - 2059 (6 pages)
Related Papers
(50 in total; entries [31]-[40] shown)
  • [31] Communication-Efficient Distributed Skyline Computation
    Zhang, Haoyu
    Zhang, Qin
    CIKM'17: PROCEEDINGS OF THE 2017 ACM CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, 2017 : 437 - 446
  • [32] Communication-Efficient Distributed Learning: An Overview
    Cao, Xuanyu
    Basar, Tamer
    Diggavi, Suhas
    Eldar, Yonina C.
    Letaief, Khaled B.
    Poor, H. Vincent
    Zhang, Junshan
    IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 851 - 873
  • [33] Communication-Efficient Distributed Statistical Inference
    Jordan, Michael I.
    Lee, Jason D.
    Yang, Yun
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2019, 114 (526) : 668 - 681
  • [34] Communication-efficient distributed EM algorithm
    Liu, Xirui
    Wu, Mixia
    Xu, Liwen
    STATISTICAL PAPERS, 2024 : 5575 - 5592
  • [35] Communication-efficient local SGD with age-based worker selection
    Zhu, Feng
    Zhang, Jingjing
    Wang, Xin
    JOURNAL OF SUPERCOMPUTING, 2023, 79 (12) : 13794 - 13816
  • [36] Moshpit SGD: Communication-Efficient Decentralized Training on Heterogeneous Unreliable Devices
    Ryabinin, Max
    Gorbunov, Eduard
    Plokhotnyuk, Vsevolod
    Pekhimenko, Gennady
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [38] Efficient-Adam: Communication-Efficient Distributed Adam
    Chen, C.
    Shen, L.
    Liu, W.
    Luo, Z.-Q.
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2023, 71 : 3257 - 3266
  • [39] Communication-Efficient Distributed Mining of Association Rules
    Schuster, Assaf
    Wolff, Ran
    Data Mining and Knowledge Discovery, 2004, 8 : 171 - 196
  • [40] Communication-Efficient Distributed PCA by Riemannian Optimization
    Huang, Long-Kai
    Pan, Sinno Jialin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119