Communication-efficient Federated Learning via Quantized Clipped SGD

Cited by: 1
Authors
Jia, Ninghui [1 ]
Qu, Zhihao [1 ]
Ye, Baoliu [2 ]
Affiliations
[1] Hohai University, Nanjing, China
[2] Nanjing University, Nanjing, China
Funding
National Key R&D Program of China
Keywords
Federated learning; Gradient quantization; Clipped gradient descent
DOI
10.1007/978-3-030-85928-2_44
Chinese Library Classification (CLC)
TP [Automation and Computer Technology]
Subject Classification Code
0812
Abstract
Communication is widely regarded as a major bottleneck of Federated Learning (FL) in mobile edge networks, since participating workers iteratively transmit gradients to and receive models from the server. Compression techniques such as quantization, which reduce communication overhead, and step-size adaptation techniques such as Clipped Stochastic Gradient Descent (Clipped SGD), which accelerate convergence, are two orthogonal approaches to improving FL performance. However, their combination has received little study. To fill this gap, we propose Quantized Clipped SGD (QCSGD) to achieve communication-efficient FL. The main challenge of the combination is that gradient quantization perturbs the step-size adjustment policy of Clipped SGD, so the standard convergence guarantees no longer apply. We therefore establish the convergence rate of QCSGD through a thorough theoretical analysis and show that it is comparable to that of SGD without compression. We also conduct extensive experiments on various machine learning models and datasets, showing that QCSGD outperforms state-of-the-art methods.
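As a rough illustration of the update the abstract describes, the Python sketch below combines a QSGD-style unbiased stochastic quantizer with the clipped step-size rule eta_t = min(eta, gamma / ||g||) from Clipped SGD. The function names, the choice of quantizer, and the constants are assumptions for illustration, not the paper's actual algorithm.

    import numpy as np

    def quantize(v, s=8, rng=np.random.default_rng(0)):
        # QSGD-style stochastic quantization (an assumed quantizer, not
        # necessarily the paper's): each coordinate of v is randomly
        # rounded to one of s uniform levels in [0, ||v||], with
        # probabilities chosen so the result is an unbiased estimate of v.
        norm = np.linalg.norm(v)
        if norm == 0.0:
            return v
        scaled = np.abs(v) / norm * s          # each entry lies in [0, s]
        lower = np.floor(scaled)
        round_up = rng.random(v.shape) < (scaled - lower)
        return np.sign(v) * (lower + round_up) * norm / s

    def qcsgd_step(w, grad, eta=0.1, gamma=1.0):
        # One hypothetical QCSGD-style update: quantize the stochastic
        # gradient, then clip the step size at gamma / ||q|| so that no
        # single update moves the model by more than gamma in norm.
        q = quantize(grad)
        step = min(eta, gamma / (np.linalg.norm(q) + 1e-12))
        return w - step * q

    # Toy usage on f(w) = 0.5 * ||w||^2, whose gradient at w is w itself.
    w = np.ones(10)
    for _ in range(100):
        w = qcsgd_step(w, grad=w)

In a full FL loop, each worker would send its quantized gradient to the server, which averages the contributions before applying the clipped step; the sketch shows only the single-worker update.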
Pages: 559-571
Page count: 13