Communication-Efficient Distributed Learning: An Overview

Cited by: 18
Authors
Cao, Xuanyu [1 ]
Basar, Tamer [2 ]
Diggavi, Suhas [3 ]
Eldar, Yonina C. [4 ]
Letaief, Khaled B. [1 ]
Poor, H. Vincent [5 ]
Zhang, Junshan [6 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[2] Univ Illinois, Dept Elect & Comp Engn, Urbana, IL 61801 USA
[3] Univ Calif Los Angeles, Dept Elect & Comp Engn, Los Angeles, CA 90095 USA
[4] Weizmann Inst Sci, Dept Math & Comp Sci, IS-7610001 Rehovot, Israel
[5] Princeton Univ, Dept Elect & Comp Engn, Princeton, NJ 08544 USA
[6] Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Distance learning; Computer aided instruction; Servers; Distributed databases; Sensors; Resource management; Training; Distributed learning; communication efficiency; event-triggering; quantization; compression; sparsification; resource allocation; incentive mechanisms; single-task learning; multitask learning; meta-learning; online learning; ALTERNATING DIRECTION METHOD; ONLINE CONVEX-OPTIMIZATION; LINEAR CONVERGENCE; MULTIAGENT NETWORKS; SUBGRADIENT METHODS; TIME OPTIMIZATION; CONSENSUS; DESIGN; QUANTIZATION; ALGORITHMS;
DOI
10.1109/JSAC.2023.3242710
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808 ; 0809 ;
Abstract
Distributed learning is envisioned as the bedrock of next-generation intelligent networks, where intelligent agents, such as mobile devices, robots, and sensors, exchange information with each other or with a parameter server to train machine learning models collaboratively without uploading raw data to a central entity for centralized processing. By utilizing the computation/communication capabilities of individual agents, the distributed learning paradigm can mitigate the burden on central processors and help preserve the data privacy of users. Despite its promising applications, a downside of distributed learning is its need for iterative information exchange over wireless channels, which may lead to communication overhead that is unaffordable in many practical systems with limited radio resources such as energy and bandwidth. To overcome this communication bottleneck, there is an urgent need for communication-efficient distributed learning algorithms that reduce the communication cost while achieving satisfactory learning/optimization performance. In this paper, we present a comprehensive survey of prevailing methodologies for communication-efficient distributed learning, including reduction of the number of communications, compression and quantization of the exchanged information, radio resource management for efficient learning, and game-theoretic mechanisms incentivizing user participation. We also point out potential directions for future research to further enhance the communication efficiency of distributed learning in various scenarios.
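As a concrete illustration of one compression technique the survey covers, the following is a minimal, hypothetical sketch (not taken from the paper) of top-k gradient sparsification: each agent transmits only the k largest-magnitude entries of its gradient, so a message of dim floats shrinks to k index/value pairs.

```python
# Hypothetical sketch of top-k gradient sparsification, one of the
# compression methods surveyed. Function names are illustrative only.

def topk_sparsify(grad, k):
    """Return (indices, values) of the k largest-magnitude entries of grad."""
    top = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)[:k]
    top.sort()  # sorted indices keep the sparse message canonical
    return top, [grad[i] for i in top]

def desparsify(indices, values, dim):
    """Reconstruct a dense gradient from the received sparse message."""
    dense = [0.0] * dim
    for i, v in zip(indices, values):
        dense[i] = v
    return dense

# An agent sends 2 index/value pairs instead of 5 floats:
grad = [0.1, -2.0, 0.05, 1.5, -0.3]
idx, vals = topk_sparsify(grad, 2)      # idx = [1, 3], vals = [-2.0, 1.5]
restored = desparsify(idx, vals, len(grad))
```

In practice such schemes are often paired with error feedback (accumulating the dropped entries locally for future rounds) to preserve convergence, a point the survey's compression/sparsification discussion addresses.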
Pages: 851-873
Page count: 23
Related Papers
50 records in total
  • [1] More communication-efficient distributed sparse learning
    Zhou, Xingcai
    Yang, Guang
    [J]. INFORMATION SCIENCES, 2024, 668
  • [3] Communication-Efficient Distributed Learning of Discrete Probability Distributions
    Diakonikolas, Ilias
    Grigorescu, Elena
    Li, Jerry
    Natarajan, Abhiram
    Onak, Krzysztof
    Schmidt, Ludwig
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 30 (NIPS 2017), 2017, 30
  • [4] Communication-Efficient and Resilient Distributed Q-Learning
    Xie, Yijing
    Mou, Shaoshuai
    Sundaram, Shreyas
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (03) : 3351 - 3364
  • [5] Local Stochastic ADMM for Communication-Efficient Distributed Learning
    ben Issaid, Chaouki
    Elgabli, Anis
    Bennis, Mehdi
    [J]. 2022 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE (WCNC), 2022, : 1880 - 1885
  • [6] Communication-Efficient Distributed Cooperative Learning With Compressed Beliefs
    Toghani, Mohammad Taha
    Uribe, Cesar A.
    [J]. IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2022, 9 (03): : 1215 - 1226
  • [7] Ordered Gradient Approach for Communication-Efficient Distributed Learning
    Chen, Yicheng
    Sadler, Brian M.
    Blum, Rick S.
    [J]. PROCEEDINGS OF THE 21ST IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC2020), 2020,
  • [8] Communication-Efficient and Privacy-Aware Distributed Learning
    Gogineni, Vinay Chakravarthi
    Moradi, Ashkan
    Venkategowda, Naveen K. D.
    Werner, Stefan
    [J]. IEEE TRANSACTIONS ON SIGNAL AND INFORMATION PROCESSING OVER NETWORKS, 2023, 9 : 705 - 720
  • [9] Communication-efficient Distributed Learning for Large Batch Optimization
    Liu, Rui
    Mozafari, Barzan
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [10] Communication-Efficient and Byzantine-Robust Distributed Learning
    Ghosh, Avishek
    Maity, Raj Kumar
    Kadhe, Swanand
    Mazumdar, Arya
    Ramchandran, Kannan
    [J]. 2020 INFORMATION THEORY AND APPLICATIONS WORKSHOP (ITA), 2020,