Adaptive Heterogeneous Client Sampling for Federated Learning Over Wireless Networks

Cited by: 0
Authors
Luo, Bing [1,2]
Xiao, Wenli [3 ]
Wang, Shiqiang [4 ]
Huang, Jianwei [5 ]
Tassiulas, Leandros [6,7]
Affiliations
[1] Duke Kunshan Univ, Data Sci Res Ctr, Kunshan 215316, Peoples R China
[2] Peng Cheng Lab PCL, Shenzhen 518066, Peoples R China
[3] Carnegie Mellon Univ, Inst Robot, Pittsburgh, PA 15213 USA
[4] IBM TJ Watson Res Ctr, Yorktown Hts, NY 10598 USA
[5] Chinese Univ Hong Kong, Shenzhen Inst Artificial Intelligence & Robot Soc, Sch Sci & Engn, Shenzhen 518172, Peoples R China
[6] Yale Univ, Dept Elect Engn, New Haven, CT 06520 USA
[7] Yale Univ, Inst Network Sci, New Haven, CT 06520 USA
Funding
National Natural Science Foundation of China;
Keywords
Convergence; Bandwidth; Training; Wireless networks; Prototypes; Optimization; Probability; Client sampling; convergence analysis; federated learning; optimization algorithm; statistical heterogeneity; system heterogeneity; wireless networks; OPTIMIZATION;
DOI
10.1109/TMC.2024.3368473
CLC Classification Number
TP [Automation Technology, Computer Technology];
Subject Classification Number
0812;
Abstract
Federated learning (FL) algorithms usually sample a fraction of clients in each round (partial participation) when the number of participants is large and the server's communication bandwidth is limited. Recent works on the convergence analysis of FL have focused on unbiased client sampling, e.g., sampling uniformly at random, which suffers from slow wall-clock convergence due to high degrees of system heterogeneity (e.g., diverse computation and communication capacities) and statistical heterogeneity (e.g., unbalanced and non-i.i.d. data). This article aims to design an adaptive client sampling algorithm for FL over wireless networks that tackles both system and statistical heterogeneity to minimize the wall-clock convergence time. We obtain a new tractable convergence bound for FL algorithms with arbitrary client sampling probabilities. Based on the bound, we analytically establish the relationship between the total learning time and the sampling probabilities under an adaptive bandwidth allocation scheme, which results in a non-convex optimization problem. We design an efficient algorithm for learning the unknown parameters in the convergence bound and develop a low-complexity algorithm to approximately solve the non-convex problem. Our solution reveals the impact of system and statistical heterogeneity parameters on the optimal client sampling design. Moreover, our solution shows that as the number of sampled clients increases, the total convergence time first decreases and then increases, because a larger sample reduces the number of rounds needed for convergence but lengthens the expected time per round due to limited wireless bandwidth. Experimental results from both a hardware prototype and simulation demonstrate that our proposed sampling scheme significantly reduces the convergence time compared to several baseline sampling schemes. Notably, for the EMNIST dataset, our scheme on the hardware prototype spends 71% less time than baseline uniform sampling to reach the same target loss.
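The abstract's key trade-off (more sampled clients per round means fewer rounds to converge but a longer expected round time under a shared wireless bandwidth budget) can be illustrated with a minimal toy sketch. This is not the paper's algorithm or its convergence bound; all constants, the 1/m-style round-count model, and the equal bandwidth split below are hypothetical placeholders used only to show why the total time is non-monotone in the number of sampled clients.

```python
# Toy illustration (hypothetical, not the paper's method) of the round-count vs.
# per-round-time trade-off: sampling more clients reduces the rounds needed to
# converge, but stretches each round because the fixed wireless bandwidth is
# split among more uploading clients.
import numpy as np

rng = np.random.default_rng(0)

N = 100                                    # total clients (assumed)
total_bandwidth = 10.0                     # Mb/s, shared by sampled clients each round (assumed)
compute_time = rng.uniform(1.0, 5.0, N)    # s, heterogeneous local computation time (assumed)
upload_bits = rng.uniform(0.5, 2.0, N)     # Mb, model-update size per client (assumed)

def expected_round_time(m, trials=200):
    """Expected wall-clock time of one round when m clients are sampled
    uniformly and the bandwidth is split equally among them; the round ends
    when the slowest sampled client (straggler) finishes."""
    per_client_rate = total_bandwidth / m            # Mb/s per sampled client
    finish_time = compute_time + upload_bits / per_client_rate
    samples = [np.max(finish_time[rng.choice(N, m, replace=False)])
               for _ in range(trials)]
    return float(np.mean(samples))

def rounds_to_converge(m, K0=2000.0, floor=50.0):
    """Stylized round-count model: required rounds shrink roughly like 1/m
    plus a floor, mimicking the variance-reduction effect of sampling more
    clients under partial participation (purely illustrative)."""
    return K0 / m + floor

totals = {m: rounds_to_converge(m) * expected_round_time(m)
          for m in range(2, N + 1, 2)}
best_m = min(totals, key=totals.get)
print(f"estimated best sample size per round: {best_m}, "
      f"total time ~ {totals[best_m]:.1f} s")
```

Under these assumptions the estimated total time first falls and then rises as the sample size grows, matching the qualitative behavior described in the abstract; the paper itself optimizes the sampling probabilities analytically via its convergence bound rather than by this kind of brute-force search.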
Pages: 9663-9677
Page count: 15
Related Papers
50 items in total (items [31]-[40] shown below)
  • [31] Time-Triggered Federated Learning Over Wireless Networks
    Zhou, Xiaokang
    Deng, Yansha
    Xia, Huiyun
    Wu, Shaochuan
    Bennis, Mehdi
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2022, 21 (12) : 11066 - 11079
  • [32] Accelerating Split Federated Learning Over Wireless Communication Networks
    Xu, Ce
    Li, Jinxuan
    Liu, Yuan
    Ling, Yushi
    Wen, Miaowen
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (06) : 5587 - 5599
  • [33] Performance Optimization of Federated Learning over Mobile Wireless Networks
    Chen, Mingzhe
    Poor, H. Vincent
    Saad, Walid
    Cui, Shuguang
    PROCEEDINGS OF THE 21ST IEEE INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC2020), 2020,
  • [34] DELTA: Diverse Client Sampling for Fasting Federated Learning
    Wang, Lin
    Guo, Yongxin
    Lin, Tao
    Tang, Xiaoying
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [35] Differentially Private Federated Learning With Importance Client Sampling
    Chen, Lin
    Ding, Xiaofeng
    Li, Mengqi
    Jin, Hai
    IEEE TRANSACTIONS ON CONSUMER ELECTRONICS, 2024, 70 (01) : 3635 - 3649
  • [36] An Incentive Auction for Heterogeneous Client Selection in Federated Learning
    Pang, Jinlong
    Yu, Jieling
    Zhou, Ruiting
    Lui, John C. S.
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2023, 22 (10) : 5733 - 5750
  • [37] Federated Learning Under Heterogeneous and Correlated Client Availability
    Rodio, Angelo
    Faticanti, Francescomaria
    Marfoq, Othmane
    Neglia, Giovanni
    Leonardi, Emilio
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (02) : 1451 - 1460
  • [38] Fast Heterogeneous Federated Learning with Hybrid Client Selection
    Song, Duanxiao
    Shen, Guangyuan
    Gao, Dehong
    Yang, Libin
    Zhou, Xukai
    Pan, Shirui
    Lou, Wei
    Zhou, Fang
    UNCERTAINTY IN ARTIFICIAL INTELLIGENCE, 2023, 216 : 2006 - 2015
  • [39] FedACS: an Efficient Federated Learning Method Among Multiple Medical Institutions with Adaptive Client Sampling
    Gu, Yunchao
    Hu, Quanquan
    Wang, Xinliang
    Zhou, Zhong
    Lu, Sixu
    2021 14TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING, BIOMEDICAL ENGINEERING AND INFORMATICS (CISP-BMEI 2021), 2021,
  • [40] Asynchronous Wireless Federated Learning With Probabilistic Client Selection
    Yang, Jiarong
    Liu, Yuan
    Chen, Fangjiong
    Chen, Wen
    Li, Changle
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2024, 23 (07) : 7144 - 7158