Efficient Client Sampling with Compression in Heterogeneous Federated Learning

Cited by: 0
Authors
Marnissi, Ouiame [1 ]
El Hammouti, Hajar [1 ]
Bergou, El Houcine [1 ]
Affiliations
[1] Mohammed VI Polytechnic University (UM6P), College of Computing, Ben Guerir, Morocco
Keywords
Federated learning; Client sampling; Heterogeneity; Resource allocation
DOI
10.1109/INFOCOMWKSHPS61880.2024.10620859
CLC number
TP301 [Theory and Methods]
Subject classification code
081202
Abstract
Federated Learning (FL) has emerged as a promising decentralized machine learning (ML) paradigm in which distributed clients collaboratively train models without sharing their private data. However, because of their limited resources and heterogeneous properties, only a small subset of clients can participate at any given time. Furthermore, the high dimensionality of ML models incurs massive communication overhead, which considerably slows the convergence of FL. To address these challenges, we propose FedHSC, a framework that accounts for both system and statistical heterogeneity. Specifically, at each communication round, clients are sampled based on their data properties combined with the importance of their local learning updates. After completing local training, the selected clients share compressed updates with the server for aggregation. The compression rate is adjusted per client to meet the communication-delay requirement. Experimental results on CIFAR-10 show the efficiency of our approach and its robustness to non-IID data.
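The abstract describes two mechanisms: sampling clients by combining data properties with update importance, and sizing each client's compression to fit a communication-delay budget. The two-page paper does not spell out its algorithm here, so the following is only an illustrative sketch, not FedHSC's actual method: sampling probabilities proportional to (data size × update importance) and top-k sparsification with k chosen so transmission finishes within the delay budget. All function names, the link-rate model, and the importance proxy are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_clients(importance, data_sizes, k):
    # Illustrative proxy for a combined criterion: sampling probability
    # proportional to data size weighted by update importance.
    p = importance * data_sizes
    p = p / p.sum()
    return rng.choice(len(p), size=k, replace=False, p=p)

def top_k_compress(update, delay_budget, bytes_per_entry=8, link_rate=1e6):
    # Keep only as many coordinates as the client's delay budget allows:
    # sending k entries of bytes_per_entry bytes over a link of link_rate
    # bytes/s must finish within delay_budget seconds.
    k = max(1, int(delay_budget * link_rate / bytes_per_entry))
    k = min(k, update.size)
    idx = np.argpartition(np.abs(update), -k)[-k:]  # k largest-magnitude entries
    sparse = np.zeros_like(update)
    sparse[idx] = update[idx]
    return sparse

# Toy round: 10 clients with 1000-dimensional model updates.
importance = rng.random(10)                          # e.g., norm of last local update
data_sizes = rng.integers(50, 500, 10).astype(float)
selected = sample_clients(importance, data_sizes, k=3)
update = rng.standard_normal(1000)
compressed = top_k_compress(update, delay_budget=0.001)  # 1 ms budget -> 125 entries
```

A slower link or tighter budget shrinks k automatically, which mirrors the per-client rate adjustment the abstract mentions without claiming the paper's exact rule.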
Pages: 2