Communication-Efficient Nonconvex Federated Learning With Error Feedback for Uplink and Downlink

Cited by: 0
Authors
Zhou, Xingcai [1 ]
Chang, Le [1 ]
Cao, Jinde [2 ,3 ]
Affiliations
[1] Nanjing Audit Univ, Sch Stat & Data Sci, Nanjing 211815, Peoples R China
[2] Southeast Univ, Sch Math, Nanjing 210096, Peoples R China
[3] Yonsei Univ, Yonsei Frontier Lab, Seoul 03722, South Korea
Funding
National Natural Science Foundation of China;
Keywords
Uplink; Downlink; Costs; Convergence; Federated learning; Servers; Markov processes; Communication-efficient; distributed learning; error feedback (EF); lazily aggregated gradient (LAG); nonconvex learning; uplink and downlink;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
In large-scale online learning, the reliance on sophisticated model architectures often leads to nonconvex distributed optimization, which is more challenging than the convex case. Workers recruited online, such as mobile phones, laptops, and desktop computers, typically have much narrower uplink than downlink bandwidth. In this article, we propose two communication-efficient nonconvex federated learning algorithms that combine error feedback 2021 (EF21) with the lazily aggregated gradient (LAG) to adapt uplink and downlink communication. EF21 is a new, theoretically stronger error-feedback (EF) scheme that consistently and substantially outperforms vanilla EF in practice; LAG is a gradient-filtering technique for adaptive communication. To reduce uplink communication costs, we design an effective LAG rule and propose the EF21-with-LAG (EF-LAG) algorithm, which combines EF21 with this rule. We also present a bidirectional EF-LAG (BiEF-LAG) algorithm that reduces both uplink and downlink communication costs. Theoretically, our algorithms enjoy the same fast O(1/T) convergence rate as gradient descent (GD) for smooth nonconvex learning; that is, they greatly reduce communication costs without sacrificing learning quality. Numerical experiments on both synthetic data and deep learning benchmarks demonstrate the significant communication savings of our algorithms.
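A minimal sketch may help make the abstract's two ingredients concrete: EF21 compresses the difference between a worker's fresh gradient and a locally maintained gradient tracker, and a LAG-style rule skips an uplink entirely when that difference is uninformative. The sketch below runs on a toy quadratic split across workers; the top-k compressor, the specific skipping threshold, and all names (topk, local_grad, the trigger constant) are illustrative assumptions, not the paper's exact EF-LAG rule.

```python
# Minimal sketch of an EF21-style update with a LAG-style upload-skipping
# rule on a toy quadratic. The compressor, trigger, and constants are
# illustrative assumptions, not the paper's exact EF-LAG algorithm.
import numpy as np

rng = np.random.default_rng(0)
n_workers, dim, k, lr, T = 5, 20, 4, 0.002, 800

# Toy local objectives: f_i(x) = 0.5 * ||A_i x - b_i||^2
A = [rng.normal(size=(30, dim)) for _ in range(n_workers)]
b = [rng.normal(size=30) for _ in range(n_workers)]

def local_grad(i, x):
    return A[i].T @ (A[i] @ x - b[i])

def topk(v, k):
    # Contractive top-k compressor: keep the k largest-magnitude entries.
    out = np.zeros_like(v)
    idx = np.argpartition(np.abs(v), -k)[-k:]
    out[idx] = v[idx]
    return out

x = np.zeros(dim)
g_i = [np.zeros(dim) for _ in range(n_workers)]  # per-worker gradient trackers
g = np.zeros(dim)                                # server aggregate = mean of g_i
uploads = 0

for t in range(T):
    for i in range(n_workers):
        delta = local_grad(i, x) - g_i[i]        # innovation since last upload
        # LAG-style trigger (assumed form): upload only when the innovation
        # is non-negligible relative to the current aggregate.
        if delta @ delta > 1e-3 * (g @ g):
            c = topk(delta, k)                   # EF21: compress the difference
            g_i[i] += c                          # shift the local tracker
            g += c / n_workers                   # server updates the mean tracker
            uploads += 1
        # else: skip the uplink; the server reuses the stale tracker g_i[i]
    x -= lr * g                                  # GD step with the aggregate

print(f"uplink messages: {uploads} of {T * n_workers}")
full_grad = sum(local_grad(i, x) for i in range(n_workers)) / n_workers
print(f"final gradient norm: {np.linalg.norm(full_grad):.4f}")
```

Because EF21 compresses tracker differences rather than raw gradients, the aggregate g tracks the true average gradient even under aggressive compression, which is what preserves the O(1/T) rate; the LAG-style test then removes uplinks that would barely move g. A bidirectional variant would additionally compress and lazily send the server-to-worker updates on the downlink.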
Pages
1003-1014 (12 pages)