Communication-Efficient Nonconvex Federated Learning With Error Feedback for Uplink and Downlink

Cited: 0
Authors
Zhou, Xingcai [1 ]
Chang, Le [1 ]
Cao, Jinde [2 ,3 ]
Affiliations
[1] Nanjing Audit Univ, Sch Stat & Data Sci, Nanjing 211815, Peoples R China
[2] Southeast Univ, Sch Math, Nanjing 210096, Peoples R China
[3] Yonsei Univ, Yonsei Frontier Lab, Seoul 03722, South Korea
Funding
National Natural Science Foundation of China;
Keywords
Communication-efficient; distributed learning; error feedback (EF); lazily aggregated gradient (LAG); nonconvex learning; uplink and downlink;
DOI
10.1109/TNNLS.2023.3333804
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In large-scale online learning, the reliance on sophisticated model architectures often leads to nonconvex distributed optimization, which is more challenging than convex optimization. Workers recruited online, such as mobile phones, laptops, and desktop computers, typically have narrower uplink bandwidths than downlink bandwidths. In this article, we propose two communication-efficient nonconvex federated learning algorithms that combine error feedback 2021 (EF21) and the lazily aggregated gradient (LAG) technique to adapt uplink and downlink communication. EF21 is a new and theoretically stronger error feedback (EF) scheme that consistently and substantially outperforms vanilla EF in practice; LAG is a gradient filtering technique that adapts communication by skipping uploads whose gradients have changed little. To reduce uplink communication costs, we design an effective LAG rule and propose the EF21 with LAG (EF-LAG) algorithm, which combines EF21 with this rule. We also present a bidirectional EF-LAG (BiEF-LAG) algorithm that reduces both uplink and downlink communication costs. Theoretically, the proposed algorithms enjoy the same fast $\mathcal{O}(1/T)$ convergence rate as gradient descent (GD) for smooth nonconvex learning. That is, our algorithms greatly reduce communication costs without sacrificing learning quality. Numerical experiments on both synthetic data and deep learning benchmarks demonstrate the significant communication savings of our algorithms.
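As a concrete illustration of the mechanism the abstract describes, the following Python/NumPy sketch combines EF21-style error feedback with a LAG-style skipping test on the uplink. It is a minimal sketch under stated assumptions, not the paper's EF-LAG or BiEF-LAG implementation: the top-k compressor, the lag_threshold trigger, the toy nonconvex loss, and all names (top_k, Worker, train) are illustrative choices, and the paper's exact LAG rule, step sizes, and downlink compression are not reproduced here.

import numpy as np

def top_k(v, k):
    # Keep the k largest-magnitude entries of v (a contractive compressor).
    idx = np.argsort(np.abs(v))[-k:]
    out = np.zeros_like(v)
    out[idx] = v[idx]
    return out

class Worker:
    def __init__(self, A, b, k):
        self.A, self.b, self.k = A, b, k
        self.g = np.zeros(A.shape[1])        # EF21 state: last gradient estimate shared with the server

    def local_grad(self, x):
        # Toy smooth nonconvex loss: mean squared error plus a nonconvex regularizer.
        r = self.A @ x - self.b
        return self.A.T @ r / len(self.b) + 0.1 * 2 * x / (1 + x ** 2) ** 2

    def uplink(self, x, lag_threshold):
        # Return a compressed update, or None if the LAG-style test skips this round.
        delta = self.local_grad(x) - self.g  # innovation w.r.t. the stored estimate
        if np.linalg.norm(delta) ** 2 <= lag_threshold:
            return None                      # lazy: the server reuses the old estimate
        c = top_k(delta, self.k)             # EF21: compress the difference, not the raw gradient
        self.g = self.g + c                  # keep the local copy consistent with the server's
        return c

def train(workers, dim, lr=0.1, rounds=200, lag_threshold=1e-3):
    x = np.zeros(dim)
    g_server = np.zeros(dim)                 # server's running aggregate of worker estimates
    uploads = 0
    for _ in range(rounds):
        for w in workers:
            c = w.uplink(x, lag_threshold)
            if c is not None:
                g_server += c / len(workers)
                uploads += 1
        x = x - lr * g_server                # GD-style step with the aggregated estimate
    return x, uploads

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, k, n_workers = 20, 5, 8
    workers = [Worker(rng.normal(size=(30, dim)), rng.normal(size=30), k)
               for _ in range(n_workers)]
    x, uploads = train(workers, dim)
    print("uplink messages:", uploads, "of", n_workers * 200)

The intent of the sketch is that only the compressed difference between the fresh local gradient and the stored estimate crosses the uplink, and the threshold test skips even that when the difference is negligible; a bidirectional variant in the spirit of BiEF-LAG would apply the same compression-and-skipping idea to the server-to-worker broadcast as well.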
Pages: 1-12
Page count: 12
Related Papers
50 records in total (first 10 listed)
  • [1] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [2] Communication-Efficient Model Aggregation With Layer Divergence Feedback in Federated Learning
    Wang, Liwei
    Li, Jun
    Chen, Wen
    Wu, Qingqing
    Ding, Ming
IEEE COMMUNICATIONS LETTERS, 2024, 28 (10) : 2293 - 2297
  • [3] Communication-Efficient Vertical Federated Learning
    Khan, Afsana
    ten Thij, Marijn
    Wilbik, Anna
    ALGORITHMS, 2022, 15 (08)
  • [4] Communication-Efficient Adaptive Federated Learning
    Wang, Yujia
    Lin, Lu
    Chen, Jinghui
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,
  • [5] Communication-Efficient Personalized Federated Edge Learning for Massive MIMO CSI Feedback
    Cui, Yiming
    Guo, Jiajia
    Wen, Chao-Kai
    Jin, Shi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (07) : 7362 - 7375
  • [6] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [7] Communication-Efficient Federated Learning for Decision Trees
    Zhao, Shuo
    Zhu, Zikun
    Li, Xin
    Chen, Ying-Chi
IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11) : 5478 - 5492
  • [8] How Robust is Federated Learning to Communication Error? A Comparison Study Between Uplink and Downlink Channels
    Qu, Linping
    Song, Shenghui
    Tsui, Chi-Ying
    Mao, Yuyi
    2024 IEEE WIRELESS COMMUNICATIONS AND NETWORKING CONFERENCE, WCNC 2024, 2024,
  • [9] Communication-Efficient Federated Learning with Adaptive Quantization
    Mao, Yuzhu
    Zhao, Zihao
    Yan, Guangfeng
    Liu, Yang
    Lan, Tian
    Song, Linqi
    Ding, Wenbo
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04)
  • [10] Communication-Efficient Secure Aggregation for Federated Learning
    Ergun, Irem
    Sami, Hasin Us
    Guler, Basak
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 3881 - 3886