Communication-efficient Federated Learning via Quantized Clipped SGD

Cited: 1
Authors
Jia, Ninghui [1 ]
Qu, Zhihao [1 ]
Ye, Baoliu [2 ]
Affiliations
[1] Hohai Univ, Nanjing, Peoples R China
[2] Nanjing Univ, Nanjing, Peoples R China
Funding
National Key Research and Development Program of China
Keywords
Federated learning; Gradient quantization; Clipped gradient descent;
DOI
10.1007/978-3-030-85928-2_44
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Communication is widely regarded as a major bottleneck of Federated Learning (FL) in mobile edge networks, since participating workers iteratively transmit gradients to and receive models from the server. Compression techniques such as quantization, which reduce communication overhead, and hyperparameter optimization techniques such as Clipped Stochastic Gradient Descent (Clipped SGD), which accelerate convergence, are two orthogonal approaches to improving the performance of FL. However, their combination has received little study. To fill this gap, we propose Quantized Clipped SGD (QCSGD) to achieve communication-efficient FL. The major challenge of combining the two lies in the fact that gradient quantization fundamentally affects the step-size adjustment policy of Clipped SGD, leaving the combined method without a convergence guarantee. We therefore establish the convergence rate of QCSGD through a thorough theoretical analysis and show that QCSGD attains a convergence rate comparable to that of SGD without compression. Extensive experiments on various machine learning models and datasets demonstrate that QCSGD outperforms state-of-the-art methods.
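The record contains no pseudocode, so the following is a minimal sketch of how one quantized clipped SGD round might be organized, assuming a QSGD-style unbiased stochastic quantizer and the standard view of clipping as an adaptive step size. The function names, quantizer, and hyperparameters are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def stochastic_quantize(v, num_levels=8):
    """QSGD-style unbiased stochastic quantizer (an assumption, not
    necessarily the paper's scheme): each coordinate is randomly rounded
    to one of `num_levels` levels in [0, ||v||_2], keeping its sign,
    so that the quantized vector equals v in expectation."""
    norm = np.linalg.norm(v)
    if norm == 0.0:
        return v
    scaled = np.abs(v) / norm * num_levels        # each entry in [0, num_levels]
    lower = np.floor(scaled)
    round_up = np.random.rand(*v.shape) < (scaled - lower)
    return np.sign(v) * (lower + round_up) * norm / num_levels

def clipped_step(aggregate_grad, lr=0.1, clip_threshold=1.0):
    """Clipped-SGD update: equivalent to the adaptive step size
    min(lr, lr * clip_threshold / ||g||). Quantization perturbs ||g||,
    which is the interaction the paper's analysis must account for."""
    norm = np.linalg.norm(aggregate_grad)
    scale = min(1.0, clip_threshold / norm) if norm > 0 else 1.0
    return lr * scale * aggregate_grad

def qcsgd_round(model, local_grads, lr=0.1, clip_threshold=1.0, num_levels=8):
    """One illustrative round: workers quantize their local stochastic
    gradients for cheap transmission; the server averages the quantized
    gradients and applies a clipped step."""
    quantized = [stochastic_quantize(g, num_levels) for g in local_grads]
    aggregate = np.mean(quantized, axis=0)
    return model - clipped_step(aggregate, lr, clip_threshold)

# Toy usage: three workers, a 5-dimensional model.
model = np.zeros(5)
grads = [np.random.randn(5) for _ in range(3)]
print(qcsgd_round(model, grads))
```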
Pages: 559-571
Page count: 13