Tackling System and Statistical Heterogeneity for Federated Learning with Adaptive Client Sampling

Cited by: 88
Authors
Luo, Bing [1 ,2 ,4 ,5 ]
Xiao, Wenli [1 ,2 ]
Wang, Shiqiang [3 ]
Huang, Jianwei [1 ,2 ]
Tassiulas, Leandros [4 ,5 ]
Affiliations
[1] Shenzhen Inst Artificial Intelligence & Robot Soc, Shenzhen, Peoples R China
[2] Chinese Univ Hong Kong, Sch Sci & Engn, Shenzhen, Peoples R China
[3] IBM Corp, TJ Watson Res Ctr, Yorktown Hts, NY USA
[4] Yale Univ, Dept Elect Engn, New Haven, CT 06520 USA
[5] Yale Univ, Inst Network Sci, New Haven, CT 06520 USA
Keywords
OPTIMIZATION;
DOI
10.1109/INFOCOM48880.2022.9796935
Chinese Library Classification
TP3 [Computing Technology, Computer Technology];
Subject Classification Code
0812
Abstract
Federated learning (FL) algorithms usually sample a fraction of clients in each round (partial participation) when the number of participants is large and the server's communication bandwidth is limited. Recent works on the convergence analysis of FL have focused on unbiased client sampling, e.g., sampling uniformly at random, which suffers from slow wall-clock convergence time due to high degrees of system heterogeneity and statistical heterogeneity. This paper aims to design an adaptive client sampling algorithm that tackles both system and statistical heterogeneity to minimize the wall-clock convergence time. We obtain a new tractable convergence bound for FL algorithms with arbitrary client sampling probabilities. Based on this bound, we analytically establish the relationship between the total learning time and the sampling probabilities, which yields a non-convex optimization problem for training time minimization. We design an efficient algorithm for learning the unknown parameters in the convergence bound and develop a low-complexity algorithm to approximately solve the non-convex problem. Experimental results from both a hardware prototype and simulations demonstrate that our proposed sampling scheme significantly reduces the convergence time compared to several baseline sampling schemes. Notably, on the hardware prototype our scheme spends 73% less time than the uniform sampling baseline to reach the same target loss.
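The abstract's core mechanism is unbiased client sampling with arbitrary probabilities: when client i is sampled with probability q_i, its update is rescaled by p_i / (K * q_i) (with p_i the client's data weight and K the number of sampled clients per round), so the aggregate equals the full-participation update in expectation regardless of how the q_i are chosen. The sketch below illustrates this importance-weighted aggregation; all function and variable names (`sample_clients`, `aggregate`, `probs`, `data_weights`) are illustrative assumptions, not the paper's implementation.

```python
import random

def sample_clients(probs, num_sampled, seed=None):
    """Sample client indices with replacement according to probabilities q_i."""
    rng = random.Random(seed)
    return [rng.choices(range(len(probs)), weights=probs, k=1)[0]
            for _ in range(num_sampled)]

def aggregate(global_model, updates, sampled, probs, data_weights):
    """Unbiased aggregation: scale each sampled update by p_i / (K * q_i).

    Taking expectation over the sampling distribution recovers the
    full-participation weighted average sum_i p_i * update_i, for ANY
    choice of sampling probabilities q_i > 0.
    """
    k = len(sampled)
    new_model = list(global_model)
    for i in sampled:
        scale = data_weights[i] / (k * probs[i])
        for j, delta in enumerate(updates[i]):
            new_model[j] += scale * delta
    return new_model

# Example: two clients with equal data weights; when q_i = p_i the
# scale reduces to 1/K and aggregation is a plain average.
probs = [0.5, 0.5]
data_weights = [0.5, 0.5]
updates = [[1.0], [1.0]]
model = aggregate([0.0], updates, sampled=[0, 1],
                  probs=probs, data_weights=data_weights)
```

Because the estimator is unbiased for any q_i, the sampling probabilities become free design variables, which is what lets the paper tune them to minimize wall-clock time rather than fixing them to uniform.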
Pages: 1739 - 1748
Page count: 10
Related Papers
50 items in total
  • [1] Linear Convergence in Federated Learning: Tackling Client Heterogeneity and Sparse Gradients
    Mitra, Aritra
    Jaafar, Rayana
    Pappas, George J.
    Hassani, Hamed
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [2] Adaptive Client Sampling in Federated Learning via Online Learning with Bandit Feedback
    Zhao, Boxin
    Wang, Lingxiao
    Liu, Ziqi
    Zhang, Zhiqiang
    Zhou, Jun
    Chen, Chaochao
    Kolar, Mladen
    JOURNAL OF MACHINE LEARNING RESEARCH, 2025, 26 : 1 - 67
  • [3] RingSFL: An Adaptive Split Federated Learning Towards Taming Client Heterogeneity
    Shen, Jinglong
    Cheng, Nan
    Wang, Xiucheng
    Lyu, Feng
    Xu, Wenchao
    Liu, Zhi
    Aldubaikhy, Khalid
    Shen, Xuemin
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (05) : 5462 - 5478
  • [4] Depersonalized Federated Learning: Tackling Statistical Heterogeneity by Alternating Stochastic Gradient Descent
    Zhou, Yujie
    Li, Zhidu
    Tang, Tong
    Wang, Ruyan
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1988 - 1993
  • [5] Adaptive Heterogeneous Client Sampling for Federated Learning Over Wireless Networks
    Luo, Bing
    Xiao, Wenli
    Wang, Shiqiang
    Huang, Jianwei
    Tassiulas, Leandros
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9663 - 9677
  • [6] FedClust: Tackling Data Heterogeneity in Federated Learning through Weight-Driven Client Clustering
    Islam, Md Sirajul
    Javaherian, Simin
    Xu, Fei
    Yuan, Xu
    Chen, Li
    Tzeng, Nian-Feng
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 474 - 483
  • [7] Tackling data-heterogeneity variations in federated learning via adaptive aggregate weights
    Yin, Qiaoyun
    Feng, Zhiyong
    Li, Xiaohong
    Chen, Shizhan
    Wu, Hongyue
    Han, Gaoyong
    KNOWLEDGE-BASED SYSTEMS, 2024, 304
  • [8] iSample: Intelligent Client Sampling in Federated Learning
    Imani, HamidReza
    Anderson, Jeff
    El-Ghazawi, Tarek
    6TH IEEE INTERNATIONAL CONFERENCE ON FOG AND EDGE COMPUTING (ICFEC 2022), 2022, : 58 - 65
  • [9] A General Theory for Client Sampling in Federated Learning
    Fraboni, Yann
    Vidal, Richard
    Kameni, Laetitia
    Lorenzi, Marco
    TRUSTWORTHY FEDERATED LEARNING, FL 2022, 2023, 13448 : 46 - 58
  • [10] Mobility-aware Device Sampling for Statistical Heterogeneity in Hierarchical Federated Learning
    Zhang, Songli
    Zheng, Zhenzhe
    Li, Qinya
    Wu, Fan
    Chen, Guihai
    2024 IEEE 44TH INTERNATIONAL CONFERENCE ON DISTRIBUTED COMPUTING SYSTEMS, ICDCS 2024, 2024, : 656 - 667