Communication-Efficient Federated DNN Training: Convert, Compress, Correct

Cited by: 0
Authors
Chen, Zhong-Jing [1 ]
Hernandez, Eduin E. [2 ]
Huang, Yu-Chih [1 ]
Rini, Stefano [1 ]
Affiliations
[1] National Yang Ming Chiao Tung University, Institute of Communications Engineering, Hsinchu 30010, Taiwan
[2] National Yang Ming Chiao Tung University, Department of Electronics and Electrical Engineering, Hsinchu 30010, Taiwan
DOI
10.1109/JIOT.2024.3456857
Abstract
In the federated training of a deep neural network (DNN), model updates are transmitted from the remote users to the parameter server (PS). In many scenarios of practical relevance, one is interested in reducing the communication overhead to enhance training efficiency. To address this challenge, we introduce CO3. CO3 takes its name from the three processing steps applied to reduce the communication load when transmitting the local DNN gradients from the remote users to the PS, namely: 1) gradient quantization through floating-point conversion; 2) lossless compression of the quantized gradient; and 3) correction of the quantization error. We carefully design each of the steps above to ensure good training performance under a constraint on the communication rate. In particular, in steps 1) and 2), we adopt the assumption that DNN gradients are distributed according to a generalized normal distribution, which is validated numerically in this article. For step 3), we utilize error feedback with a memory decay mechanism to correct the quantization error introduced in step 1). We argue that the memory decay coefficient, similar to the learning rate, can be optimally tuned to improve convergence. A rigorous convergence analysis of the proposed CO3 with stochastic gradient descent (SGD) is provided. Moreover, with extensive simulations, we show that CO3 offers improved performance compared with existing gradient compression schemes in the literature that employ sketching and nonuniform quantization of the local gradients.
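The following is a minimal sketch of the convert-compress-correct loop on one client, not the paper's implementation: float16 stands in for the paper's floating-point conversion (which is matched to the generalized normal gradient statistics), zlib stands in for the lossless entropy coder, and the function names and the decay coefficient BETA are hypothetical.

```python
# Hypothetical sketch of a CO3-style client round; names, the float16
# format, zlib, and BETA are illustrative assumptions, not the paper's design.
import zlib
import numpy as np

BETA = 0.9  # memory-decay coefficient for error feedback (assumed value)

def float_convert(g: np.ndarray) -> np.ndarray:
    """Step 1: quantize by converting float32 gradients to a low-precision
    floating-point format (float16 used here only as a stand-in)."""
    return g.astype(np.float16)

def lossless_compress(q: np.ndarray) -> bytes:
    """Step 2: losslessly compress the quantized gradient (zlib stands in
    for an entropy coder matched to the generalized normal model)."""
    return zlib.compress(q.tobytes())

def co3_client_step(grad: np.ndarray, memory: np.ndarray):
    """One client round: correct with the memory, convert, compress."""
    corrected = grad + memory                 # apply error-feedback memory
    quantized = float_convert(corrected)      # step 1: conversion
    payload = lossless_compress(quantized)    # step 2: compression
    # Step 3: keep the quantization residual in a decaying memory.
    residual = corrected - quantized.astype(np.float32)
    memory = BETA * memory + residual
    return payload, memory

# Toy usage: one round on a random gradient vector.
g = np.random.randn(1024).astype(np.float32)
mem = np.zeros_like(g)
payload, mem = co3_client_step(g, mem)
print(len(payload), "bytes sent instead of", g.nbytes)
```

With BETA = 1 this reduces to standard error feedback; choosing BETA < 1 decays the accumulated quantization error, which is the coefficient the abstract argues can be tuned alongside the learning rate.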
Pages: 40431-40447