Efficient Client Sampling with Compression in Heterogeneous Federated Learning

Cited by: 0
Authors
Marnissi, Ouiame [1 ]
El Hammouti, Hajar [1 ]
Bergou, El Houcine [1 ]
Affiliations
[1] Mohammed VI Polytechnic University (UM6P), College of Computing, Ben Guerir, Morocco
Keywords
Federated learning; Client sampling; Heterogeneity; Resource allocation;
DOI
10.1109/INFOCOMWKSHPS61880.2024.10620859
Chinese Library Classification
TP301 [Theory and Methods];
Discipline Code
081202;
Abstract
Federated Learning (FL) has emerged as a promising decentralized machine learning (ML) paradigm in which distributed clients collaboratively train models without sharing their private data. However, owing to clients' limited resources and heterogeneous properties, only a small subset of clients can participate at any given time. Furthermore, the high dimensionality of ML models incurs massive communication overhead that considerably slows the convergence of FL. To address these challenges, we propose FedHSC, a framework that accounts for both system and statistical heterogeneity. Specifically, at each communication round, clients are sampled based on their data properties combined with the importance of their local learning updates. After completing local training, the selected clients send compressed updates to the server for aggregation, and the compression rate is adjusted per client to meet a communication delay requirement. Experimental results on CIFAR-10 show the efficiency of our approach and its robustness to non-IID data.
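The round structure described in the abstract (sample clients by update importance, then compress each selected client's update to fit a delay budget) could be sketched as follows. The paper does not specify its compressor or importance measure, so the norm-proportional sampling, top-k sparsification, and every function name, bandwidth value, and parameter below are illustrative assumptions rather than FedHSC's actual method.

```python
import numpy as np

def topk_compress(update, k):
    """Keep the k largest-magnitude entries of an update vector, zero the rest."""
    idx = np.argsort(np.abs(update))[-k:]
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

def k_for_delay_budget(bandwidth_bps, delay_budget_s, bits_per_entry=64):
    """Largest number of entries a client can send within its delay budget."""
    return int(bandwidth_bps * delay_budget_s // bits_per_entry)

def sampling_probs(update_norms):
    """Sample clients with probability proportional to their last update norm
    (one simple proxy for update importance)."""
    norms = np.asarray(update_norms, dtype=float)
    return norms / norms.sum()

rng = np.random.default_rng(0)
# Five toy clients, each holding a 1000-dimensional local update.
updates = [rng.standard_normal(1000) for _ in range(5)]
probs = sampling_probs([np.linalg.norm(u) for u in updates])
chosen = rng.choice(len(updates), size=2, replace=False, p=probs)
for c in chosen:
    # Per-client link: 256 kbit/s, 50 ms delay budget -> at most 200 entries.
    k = k_for_delay_budget(bandwidth_bps=256_000, delay_budget_s=0.05)
    compressed = topk_compress(updates[c], min(k, updates[c].size))
```

Under these assumptions, each client's compression rate follows directly from its own link speed and the shared delay requirement, so slower clients automatically send sparser updates.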
Pages: 2
Related Papers
50 in total
  • [1] Optimal Mechanism Design for Heterogeneous Client Sampling in Federated Learning
    Liao, Guocheng
    Luo, Bing
    Feng, Yutong
    Zhang, Meng
    Chen, Xu
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (11) : 10598 - 10609
  • [2] Energy-Efficient Client Sampling for Federated Learning in Heterogeneous Mobile Edge Computing Networks
    Tang, Jian
    Li, Xiuhua
    Li, Hui
    Xiong, Min
    Wang, Xiaofei
    Leung, Victor C. M.
    ICC 2024 - IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2024, : 956 - 961
  • [3] Reducing communication in federated learning via efficient client sampling
    Ribero, Monica
    Vikalo, Haris
    PATTERN RECOGNITION, 2024, 148
  • [4] Communication Efficient Federated Learning With Heterogeneous Structured Client Models
    Hu, Yao
    Sun, Xiaoyan
    Tian, Ye
    Song, Linqi
    Tan, Kay Chen
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2023, 7 (03) : 753 - 767
  • [5] Adaptive Heterogeneous Client Sampling for Federated Learning Over Wireless Networks
    Luo, Bing
    Xiao, Wenli
    Wang, Shiqiang
    Huang, Jianwei
    Tassiulas, Leandros
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (10) : 9663 - 9677
  • [6] Federated Learning With Client Selection and Gradient Compression in Heterogeneous Edge Systems
    Xu, Yang
    Jiang, Zhida
    Xu, Hongli
    Wang, Zhiyuan
    Qian, Chen
    Qiao, Chunming
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (05) : 5446 - 5461
  • [7] Dual Adaptive Compression for Efficient Communication in Heterogeneous Federated Learning
    Mao, Yingchi
    Wang, Zibo
    Li, Chenxin
    Zhang, Jiakai
    Xu, Shufang
    Wu, Jie
    2024 IEEE 24TH INTERNATIONAL SYMPOSIUM ON CLUSTER, CLOUD AND INTERNET COMPUTING, CCGRID 2024, 2024, : 236 - 244
  • [8] iSample: Intelligent Client Sampling in Federated Learning
    Imani, HamidReza
    Anderson, Jeff
    El-Ghazawi, Tarek
    6TH IEEE INTERNATIONAL CONFERENCE ON FOG AND EDGE COMPUTING (ICFEC 2022), 2022, : 58 - 65
  • [9] A General Theory for Client Sampling in Federated Learning
    Fraboni, Yann
    Vidal, Richard
    Kameni, Laetitia
    Lorenzi, Marco
    TRUSTWORTHY FEDERATED LEARNING, FL 2022, 2023, 13448 : 46 - 58
  • [10] Energy-efficient client selection in federated learning with heterogeneous data on edge
    Zhao, Jianxin
    Feng, Yanhao
    Chang, Xinyu
    Liu, Chi Harold
    PEER-TO-PEER NETWORKING AND APPLICATIONS, 2022, 15 : 1139 - 1151