Communication-Efficient Federated Learning via Predictive Coding

Cited by: 7
Authors
Yue, Kai [1 ]
Jin, Richeng [1 ]
Wong, Chau-Wai [1 ]
Dai, Huaiyu [1 ]
Affiliations
[1] NC State Univ, Dept Elect & Comp Engn, Raleigh, NC 27695 USA
Funding
U.S. National Science Foundation;
Keywords
Predictive models; Servers; Collaborative work; Predictive coding; Entropy coding; Costs; Quantization (signal); Federated learning; distributed optimization; predictive coding;
DOI
10.1109/JSTSP.2022.3142678
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline classification codes
0808; 0809;
Abstract
Federated learning enables remote workers to collaboratively train a shared machine learning model while keeping the training data local. For wireless mobile devices, communication overhead is a critical bottleneck due to limited power and bandwidth. Prior work has applied various data compression tools, such as quantization and sparsification, to reduce this overhead. In this paper, we propose a predictive coding based compression scheme for federated learning. The scheme shares prediction functions among all devices and allows each worker to transmit a compressed residual vector derived from the predicted reference. In each communication round, we select the predictor and quantizer based on the rate-distortion cost, and further reduce redundancy with entropy coding. Extensive simulations show that the communication cost can be reduced by up to 99% while achieving even better learning performance than baseline methods.
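The pipeline the abstract describes (shared predictor, residual quantization, rate-distortion-based selection) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the previous-round predictor, the uniform scalar quantizer, the candidate bit widths, and the `lam` rate weight are all illustrative assumptions, and the entropy-coding stage is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(x, n_bits):
    """Uniform scalar quantizer (illustrative); returns reconstructed values."""
    levels = 2 ** n_bits
    lo, hi = float(x.min()), float(x.max())
    if hi == lo:
        return x.copy()
    scale = (hi - lo) / (levels - 1)
    return np.round((x - lo) / scale) * scale + lo

def rd_cost(residual, n_bits, lam=1e-4):
    """Rate-distortion cost: distortion + lam * rate (bits per entry)."""
    recon = quantize(residual, n_bits)
    distortion = float(np.mean((residual - recon) ** 2))
    return distortion + lam * n_bits, recon

def compress_update(update, history, bit_choices=(2, 4, 8)):
    """Predict the current update from the previous round, then quantize the
    residual with whichever candidate quantizer minimizes the R-D cost."""
    prediction = history[-1] if history else np.zeros_like(update)
    residual = update - prediction
    best_cost, best_recon, best_bits = float("inf"), None, None
    for b in bit_choices:
        cost, recon = rd_cost(residual, b)
        if cost < best_cost:
            best_cost, best_recon, best_bits = cost, recon, b
    # The server holds the same predictor, so it reconstructs the update
    # as prediction + quantized residual.
    return prediction + best_recon, best_bits

# Toy round: successive updates are correlated, so the residual is small
# and cheap to quantize.
history = [0.1 * rng.normal(size=100)]
update = history[-1] + 0.01 * rng.normal(size=100)
recon, bits = compress_update(update, history)
err = float(np.mean((update - recon) ** 2))
```

Because the residual carries only the round-to-round innovation, quantizing it at a few bits per entry yields a far smaller reconstruction error than quantizing the raw update at the same rate, which is the intuition behind the reported communication savings.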
Pages: 369-380
Page count: 12
Related Papers (showing 10 of 50)
  • [1] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    [J]. PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [2] Communication-efficient federated learning via knowledge distillation
    Wu, Chuhan
    Wu, Fangzhao
    Lyu, Lingjuan
    Huang, Yongfeng
    Xie, Xing
    [J]. NATURE COMMUNICATIONS, 2022, 13 (01)
  • [3] Communication-efficient federated learning via personalized filter pruning
    Min, Qi
    Luo, Fei
    Dong, Wenbo
    Gu, Chunhua
    Ding, Weichao
    [J]. INFORMATION SCIENCES, 2024, 678
  • [4] Communication-efficient clustered federated learning via model distance
    Zhang, Mao
    Zhang, Tie
    Cheng, Yifei
    Bao, Changcun
    Cao, Haoyu
    Jiang, Deqiang
    Xu, Linli
    [J]. MACHINE LEARNING, 2024, 113 (06) : 3869 - 3888
  • [5] PBFL: Communication-Efficient Federated Learning via Parameter Predicting
    Li, Kaiju
    Xiao, Chunhua
    [J]. COMPUTER JOURNAL, 2023, 66 (03): : 626 - 642
  • [6] Communication-efficient Federated Learning via Quantized Clipped SGD
    Jia, Ninghui
    Qu, Zhihao
    Ye, Baoliu
    [J]. WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT I, 2021, 12937 : 559 - 571
  • [7] Communication-Efficient Vertical Federated Learning
    Khan, Afsana
    ten Thij, Marijn
    Wilbik, Anna
    [J]. ALGORITHMS, 2022, 15 (08)
  • [8] FedCO: Communication-Efficient Federated Learning via Clustering Optimization
    Al-Saedi, Ahmed A.
    Boeva, Veselka
    Casalicchio, Emiliano
    [J]. FUTURE INTERNET, 2022, 14 (12)
  • [9] Communication-Efficient Federated Learning via Quantized Compressed Sensing
    Oh, Yongjeong
    Lee, Namyoon
    Jeon, Yo-Seb
    Poor, H. Vincent
    [J]. IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (02) : 1087 - 1100