Communication-Efficient Distributed Learning: An Overview

Cited by: 18
Authors
Cao, Xuanyu [1 ]
Basar, Tamer [2 ]
Diggavi, Suhas [3 ]
Eldar, Yonina C. [4 ]
Letaief, Khaled B. [1 ]
Poor, H. Vincent [5 ]
Zhang, Junshan [6 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Kowloon, Clear Water Bay, Hong Kong, Peoples R China
[2] Univ Illinois, Dept Elect & Comp Engn, Urbana, IL 61801 USA
[3] Univ Calif Los Angeles, Dept Elect & Comp Engn, Los Angeles, CA 90095 USA
[4] Weizmann Inst Sci, Dept Math & Comp Sci, IS-7610001 Rehovot, Israel
[5] Princeton Univ, Dept Elect & Comp Engn, Princeton, NJ 08544 USA
[6] Univ Calif Davis, Dept Elect & Comp Engn, Davis, CA 95616 USA
Funding
National Natural Science Foundation of China; US National Science Foundation;
Keywords
Distance learning; Computer aided instruction; Servers; Distributed databases; Sensors; Resource management; Training; Distributed learning; communication efficiency; event-triggering; quantization; compression; sparsification; resource allocation; incentive mechanisms; single-task learning; multitask learning; meta-learning; online learning; ALTERNATING DIRECTION METHOD; ONLINE CONVEX-OPTIMIZATION; LINEAR CONVERGENCE; MULTIAGENT NETWORKS; SUBGRADIENT METHODS; TIME OPTIMIZATION; CONSENSUS; DESIGN; QUANTIZATION; ALGORITHMS;
DOI
10.1109/JSAC.2023.3242710
CLC Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Codes
0808; 0809;
Abstract
Distributed learning is envisioned as the bedrock of next-generation intelligent networks, where intelligent agents, such as mobile devices, robots, and sensors, exchange information with each other or with a parameter server to train machine learning models collaboratively without uploading raw data to a central entity for centralized processing. By utilizing the computation and communication capabilities of individual agents, the distributed learning paradigm can mitigate the burden on central processors and help preserve the data privacy of users. Despite its promising applications, a downside of distributed learning is its need for iterative information exchange over wireless channels, which may lead to communication overhead that is unaffordable in many practical systems with limited radio resources such as energy and bandwidth. To overcome this communication bottleneck, there is an urgent need for communication-efficient distributed learning algorithms that reduce the communication cost while achieving satisfactory learning/optimization performance. In this paper, we present a comprehensive survey of prevailing methodologies for communication-efficient distributed learning, including reduction of the number of communications, compression and quantization of the exchanged information, radio resource management for efficient learning, and game-theoretic mechanisms incentivizing user participation. We also point out potential directions for future research to further enhance the communication efficiency of distributed learning in various scenarios.
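The compression and sparsification methodologies mentioned in the abstract can be illustrated with a minimal sketch (function names and parameters below are my own for illustration, not taken from the paper): Top-K sparsification transmits only the k largest-magnitude gradient entries, and unbiased stochastic uniform quantization represents each entry with a few bits.

```python
import numpy as np

def top_k_sparsify(grad, k):
    """Keep only the k largest-magnitude entries of a gradient vector.

    Only k (index, value) pairs need to be transmitted instead of the
    full dense vector, reducing per-round communication cost.
    """
    idx = np.argsort(np.abs(grad))[-k:]   # indices of the k largest magnitudes
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

def stochastic_quantize(grad, num_bits=4):
    """Uniform stochastic quantization to 2**num_bits - 1 positive levels.

    Rounding up with probability equal to the fractional part makes the
    quantizer unbiased in expectation, a common requirement for
    convergence analyses of quantized distributed SGD.
    """
    levels = 2 ** num_bits - 1
    scale = np.max(np.abs(grad))
    if scale == 0:
        return grad
    normalized = np.abs(grad) / scale * levels  # in [0, levels]
    floor = np.floor(normalized)
    prob = normalized - floor                   # round up with this probability
    rounded = floor + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * rounded / levels * scale
```

In a distributed setting, each agent would apply one (or both) of these operators to its local gradient before transmission; the server then averages the compressed updates. Error-feedback variants, which accumulate the compression residual locally, are a standard refinement discussed in this literature.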
Pages: 851-873
Page count: 23