Communication-Efficient Design for Quantized Decentralized Federated Learning

Cited by: 7
Authors
Chen, Li [1]
Liu, Wei [1]
Chen, Yunfei [2]
Wang, Weidong [1]
Affiliations
[1] Univ Sci & Technol China, Dept Elect Engn & Informat Sci, Hefei 230027, Peoples R China
[2] Univ Durham, Dept Engn, Durham DH1 3LE, England
Funding
National Natural Science Foundation of China
Keywords
decentralized federated learning; doubly-adaptive quantization; Lloyd-Max quantizer; gradient descent
DOI
10.1109/TSP.2024.3363887
CLC Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Decentralized federated learning (DFL) is a variant of federated learning in which edge nodes communicate only with their one-hop neighbors to learn the optimal model. However, because information exchange in DFL is restricted to one-hop neighborhoods, inefficient exchange requires more communication rounds to reach a target training loss, which greatly reduces communication efficiency. In this paper, we propose a new non-uniform quantization of model parameters to improve DFL convergence. Specifically, we first apply the Lloyd-Max algorithm to DFL (LM-DFL) to minimize quantization distortion by adjusting the quantization levels adaptively. A convergence guarantee for LM-DFL is established without assuming a convex loss. Building on LM-DFL, we then propose a new doubly-adaptive DFL, which jointly increases the number of quantization levels over the course of training to reduce the amount of communicated information and adapts the quantization levels to non-uniform gradient distributions. Experimental results on the MNIST and CIFAR-10 datasets illustrate the superiority of LM-DFL in terms of minimal quantization distortion and show that doubly-adaptive DFL can greatly improve communication efficiency.
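For context, the core mechanism named in the abstract is the classical Lloyd-Max quantizer, which alternates two optimality conditions (decision thresholds at midpoints between levels, and each level at the centroid of its cell) to fit non-uniform quantization levels to the empirical distribution of the values being compressed. The sketch below illustrates that generic procedure on a flattened parameter vector; it is a minimal illustration of the underlying algorithm, not the authors' LM-DFL implementation, and all function and variable names are hypothetical.

```python
# Minimal sketch of a Lloyd-Max quantizer applied to a flattened model
# update. Hypothetical names; not the authors' LM-DFL code.
import numpy as np

def lloyd_max_quantize(x, num_levels, num_iters=50, tol=1e-8):
    """Fit num_levels reproduction points to the samples in x by
    alternating the two Lloyd-Max optimality conditions, then quantize x."""
    x = np.asarray(x, dtype=np.float64).ravel()
    # Initialize levels uniformly over the data range.
    levels = np.linspace(x.min(), x.max(), num_levels)
    for _ in range(num_iters):
        # Condition 1: decision thresholds are midpoints between levels.
        thresholds = 0.5 * (levels[:-1] + levels[1:])
        # Assign each sample to its quantization cell.
        cells = np.digitize(x, thresholds)
        # Condition 2: each level moves to the centroid (mean) of its cell;
        # empty cells keep their previous level.
        new_levels = levels.copy()
        for k in range(num_levels):
            members = x[cells == k]
            if members.size > 0:
                new_levels[k] = members.mean()
        if np.max(np.abs(new_levels - levels)) < tol:
            levels = new_levels
            break
        levels = new_levels
    thresholds = 0.5 * (levels[:-1] + levels[1:])
    return levels[np.digitize(x, thresholds)]

# Example: quantize a Gaussian-like parameter vector with 8 levels; the
# fitted levels concentrate where the mass of the distribution lies.
params = np.random.randn(10_000)
quantized = lloyd_max_quantize(params, num_levels=8)
print("mean squared distortion:", np.mean((params - quantized) ** 2))
```

In the doubly-adaptive scheme described in the abstract, the number of levels would additionally grow across communication rounds; the sketch keeps it fixed for clarity.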
Pages: 1175-1188 (14 pages)