Communication-Efficient Distributed SGD With Compressed Sensing

Cited by: 3
Authors
Tang, Yujie [1]
Ramanathan, Vikram [1]
Zhang, Junshan [2]
Li, Na [1]
Affiliations
[1] Harvard Univ, Sch Engn & Appl Sci, Allston, MA 02134 USA
[2] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85287 USA
Funding
U.S. National Science Foundation
Keywords
Servers; Compressed sensing; Sensors; Stochastic processes; Sparse matrices; Optimization; Convergence; Optimization algorithms; Large-scale systems; Distributed optimization
DOI
10.1109/LCSYS.2021.3137859
Chinese Library Classification
TP [Automation Technology; Computer Technology]
Discipline Code
0812
Abstract
We consider large-scale distributed optimization over a set of edge devices connected to a central server, where the limited communication bandwidth between the server and the edge devices imposes a significant bottleneck on the optimization procedure. Inspired by recent advances in federated learning, we propose a distributed stochastic gradient descent (SGD) type algorithm that exploits the sparsity of the gradient, when possible, to reduce the communication burden. At the heart of the algorithm is the use of compressed sensing techniques to compress the local stochastic gradients at the device side; at the server side, a sparse approximation of the global stochastic gradient is then recovered from the noisy aggregated compressed local gradients. We conduct a theoretical analysis of the convergence of our algorithm in the presence of the noise perturbation incurred by the communication channels, and also conduct numerical experiments to corroborate its effectiveness.
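The device/server scheme described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it assumes a Gaussian measurement matrix shared by the server and all devices, and uses orthogonal matching pursuit (OMP) as the sparse-recovery decoder, whereas the paper's actual sensing operator and decoder may differ. All dimensions and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d, m, k, n_devices = 400, 100, 5, 8  # gradient dim, measurements, sparsity, devices

# Measurement matrix shared by the server and all devices (hypothetical choice:
# i.i.d. Gaussian, scaled so columns have unit norm in expectation).
A = rng.standard_normal((m, d)) / np.sqrt(m)

def omp(y, A, k):
    """Orthogonal matching pursuit: recover a k-sparse g with y ≈ A @ g."""
    support, r = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ r)))  # column most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        r = y - A[:, support] @ coef         # refit residual on current support
    g = np.zeros(A.shape[1])
    g[support] = coef
    return g

# A k-sparse "global gradient"; each device observes it plus stochastic noise.
g_true = np.zeros(d)
g_true[rng.choice(d, size=k, replace=False)] = rng.standard_normal(k)

# Device side: each device sends m compressed numbers instead of d.
y_avg = np.mean(
    [A @ (g_true + 0.01 * rng.standard_normal(d)) for _ in range(n_devices)], axis=0
)

# Server side: sparse approximation of the averaged gradient from the
# aggregated compressed measurements.
g_hat = omp(y_avg, A, k)
rel_err = np.linalg.norm(g_hat - g_true) / np.linalg.norm(g_true)
print(f"compression {d}->{m} per device, relative recovery error {rel_err:.3f}")
```

In a full SGD loop, `g_hat` would stand in for the exact averaged gradient in the update step, so each device transmits m rather than d numbers per round.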
Pages: 2054-2059 (6 pages)