Communication-Efficient Distributed Cooperative Learning With Compressed Beliefs

Cited by: 4
Authors
Toghani, Mohammad Taha [1 ]
Uribe, Cesar A. [1 ]
Affiliations
[1] Rice Univ, Dept Elect & Comp Engn, Houston, TX 77005 USA
Keywords
Algorithm design and analysis; Bayesian update; compressed communication; distributed algorithms; INFERENCE; CONSENSUS;
DOI
10.1109/TCNS.2022.3198791
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
In this article, we study the problem of distributed cooperative learning, where a group of agents seeks to agree on a set of hypotheses that best describes a sequence of private observations. In the scenario where the set of hypotheses is large, we propose a belief update rule where agents share compressed (either sparse or quantized) beliefs with an arbitrary positive compression rate. Our algorithm leverages a unified communication rule that enables agents to access wide-ranging compression operators as black-box modules. We prove the almost sure asymptotic convergence of beliefs on the set of optimal hypotheses. Additionally, we show a nonasymptotic, explicit, and linear concentration rate in probability of the beliefs on the optimal hypothesis set. We provide numerical experiments to illustrate the communication benefits of our method. The simulation results show that the number of transmitted bits can be reduced to 5%-10% of the noncompressed method in the studied scenarios.
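The abstract mentions sparse compression of belief vectors as one of the black-box operators. As a minimal illustration of that idea (not the paper's actual operator), the sketch below keeps only the k largest entries of a belief vector over the hypothesis set and zeros the rest; the function name and the example values are assumptions for illustration only.

```python
import numpy as np

def top_k_compress(belief: np.ndarray, k: int) -> np.ndarray:
    """Illustrative top-k sparsification: keep the k largest entries of a
    belief vector and zero the rest, so only k (index, value) pairs need
    to be transmitted instead of the full vector.
    This is a generic sketch, not the operator analyzed in the paper."""
    out = np.zeros_like(belief)
    idx = np.argsort(belief)[-k:]  # indices of the k largest entries
    out[idx] = belief[idx]
    return out

# Example: a belief vector over 8 hypotheses, compressed at rate k/n = 2/8.
belief = np.array([0.05, 0.30, 0.10, 0.02, 0.25, 0.08, 0.15, 0.05])
compressed = top_k_compress(belief, k=2)
# Only the two strongest hypotheses (indices 1 and 4) survive compression.
```

With k/n = 2/8 each agent would transmit roughly a quarter of the entries per round, which is the flavor of bit saving the paper's experiments quantify (5%-10% of the uncompressed cost in their scenarios).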
Pages: 1215 - 1226 (12 pages)
Related Papers (50 total)
  • [1] Communication-Efficient Distributed SGD With Compressed Sensing
    Tang, Yujie
    Ramanathan, Vikram
    Zhang, Junshan
    Li, Na
    [J]. IEEE CONTROL SYSTEMS LETTERS, 2022, 6 : 2054 - 2059
  • [2] Communication-Efficient Distributed Learning: An Overview
    Cao, Xuanyu
    Basar, Tamer
    Diggavi, Suhas
    Eldar, Yonina C.
    Letaief, Khaled B.
    Poor, H. Vincent
    Zhang, Junshan
    [J]. IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2023, 41 (04) : 851 - 873
  • [3] AC-SGD: Adaptively Compressed SGD for Communication-Efficient Distributed Learning
    Yan, Guangfeng
    Li, Tan
    Huang, Shao-Lun
    Lan, Tian
    Song, Linqi
    [J]. IEEE JOURNAL ON SELECTED AREAS IN COMMUNICATIONS, 2022, 40 (09) : 2678 - 2693
  • [4] Communication-Efficient Federated Learning Based on Compressed Sensing
    Li, Chengxi
    Li, Gang
    Varshney, Pramod K.
    [J]. IEEE INTERNET OF THINGS JOURNAL, 2021, 8 (20) : 15531 - 15541
  • [5] Communication-efficient distributed cubic Newton with compressed lazy Hessian
    Zhang, Zhen
    Che, Keqin
    Yang, Shaofu
    Xu, Wenying
    [J]. NEURAL NETWORKS, 2024, 174
  • [6] More communication-efficient distributed sparse learning
    Zhou, Xingcai
    Yang, Guang
    [J]. INFORMATION SCIENCES, 2024, 668
  • [8] Communication-efficient Federated Learning with Cooperative Filter Selection
    Yang, Zhao
    Sun, Qingshuang
    [J]. 2022 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS (ISCAS 22), 2022, : 2172 - 2176
  • [9] A Cooperative Analysis to Incentivize Communication-Efficient Federated Learning
    Li, Youqi
    Li, Fan
    Yang, Song
    Zhang, Chuan
    Zhu, Liehuang
    Wang, Yu
    [J]. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 10175 - 10190
  • [10] Compressed Distributed Gradient Descent: Communication-Efficient Consensus over Networks
    Zhang, Xin
    Liu, Jia
    Zhu, Zhengyuan
    Bentley, Elizabeth S.
    [J]. IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2019), 2019, : 2431 - 2439