Reducing communication in federated learning via efficient client sampling

Cited by: 10
Authors
Ribero, Monica [1 ]
Vikalo, Haris [1 ]
Affiliations
[1] Univ Texas Austin, Dept Elect & Comp Engn, Austin, TX 78712 USA
Funding
U.S. National Science Foundation
Keywords
Federated learning; Machine learning; Distributed optimization;
DOI
10.1016/j.patcog.2023.110122
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated learning (FL) ameliorates privacy concerns in settings where a central server coordinates learning from data distributed across many clients; rather than sharing the data, the clients train locally and report the resulting models to the server. Aggregating the local models requires communicating massive amounts of information between the clients and the server, consuming network bandwidth. We propose a novel framework for updating the global model in communication-constrained FL systems that requests input only from clients with informative updates and estimates the local updates that are not communicated. Specifically, describing the progression of the model's weights by an Ornstein-Uhlenbeck process allows us to develop a sampling strategy for selecting a subset of clients with significant weight updates; model updates of the clients not selected for communication are replaced by their estimates. We test this policy on realistic federated benchmark datasets and show that the proposed framework provides up to a 50% reduction in communication while matching or exceeding the performance of baselines. The proposed method represents a new line of communication-efficient FL strategies that is orthogonal to existing user-driven techniques, such as compression, and thus complements rather than replaces those existing methods.
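The abstract's Ornstein-Uhlenbeck (OU) idea can be sketched concretely: the server predicts each client's next weights with the deterministic OU mean step, asks only the clients whose reported updates deviate most from that prediction to upload, and substitutes the OU estimate for everyone else. The snippet below is a minimal illustration of that reading, not the authors' implementation; the parameter names and values (ou_predict, theta, sigma, mu, the toy dimensions) and the l2 deviation score are all assumptions.

```python
# Minimal sketch of OU-based client sampling for communication-efficient FL.
# Assumptions: a linear one-step OU mean prediction on the server, l2-norm
# deviation as the "informativeness" score, and toy sizes for the round.
import numpy as np

rng = np.random.default_rng(0)

def ou_predict(w_prev, mu, theta, dt=1.0):
    """One-step OU mean prediction:
    E[w_t | w_{t-1}] = w_{t-1} + theta * (mu - w_{t-1}) * dt."""
    return w_prev + theta * (mu - w_prev) * dt

def select_clients(updates, predictions, k):
    """Pick the k clients whose local weights deviate most (in l2 norm)
    from the server-side OU prediction."""
    scores = np.linalg.norm(updates - predictions, axis=1)
    return np.argsort(scores)[-k:]

# --- toy round: 10 clients, 5-dimensional model (assumed sizes) ---
n_clients, dim, k = 10, 5, 3
mu = np.zeros(dim)               # assumed long-run mean of the weights
theta, sigma = 0.1, 0.05         # assumed OU drift and noise scales
w_prev = rng.normal(size=(n_clients, dim))                        # last communicated weights
w_new = w_prev + rng.normal(scale=sigma, size=(n_clients, dim))   # result of local training

pred = ou_predict(w_prev, mu, theta)      # server-side estimates of w_new
chosen = select_clients(w_new, pred, k)   # clients asked to upload

# Aggregate: true updates from the sampled clients, OU estimates for the rest.
agg = pred.copy()
agg[chosen] = w_new[chosen]
global_model = agg.mean(axis=0)
print("sampled clients:", sorted(chosen.tolist()))
print("global model:", np.round(global_model, 3))
```

Under this reading, each round communicates only k of n_clients uploads, and the accuracy cost depends on how well the OU estimates track the unsampled clients' true updates.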
Pages: 12
Related Papers
50 items in total
  • [31] PBFL: Communication-Efficient Federated Learning via Parameter Predicting
    Li, Kaiju
    Xiao, Chunhua
COMPUTER JOURNAL, 2023, 66 (03) : 626 - 642
  • [32] Communication-efficient clustered federated learning via model distance
    Zhang, Mao
    Zhang, Tie
    Cheng, Yifei
    Bao, Changcun
    Cao, Haoyu
    Jiang, Deqiang
    Xu, Linli
    MACHINE LEARNING, 2024, 113 (06) : 3869 - 3888
  • [33] Communication-efficient Federated Learning via Quantized Clipped SGD
    Jia, Ninghui
    Qu, Zhihao
    Ye, Baoliu
    WIRELESS ALGORITHMS, SYSTEMS, AND APPLICATIONS, WASA 2021, PT I, 2021, 12937 : 559 - 571
  • [34] Communication-Efficient Federated Learning via Quantized Compressed Sensing
    Oh, Yongjeong
    Lee, Namyoon
    Jeon, Yo-Seb
    Poor, H. Vincent
    IEEE TRANSACTIONS ON WIRELESS COMMUNICATIONS, 2023, 22 (02) : 1087 - 1100
  • [36] FedCO: Communication-Efficient Federated Learning via Clustering Optimization
    Al-Saedi, Ahmed A.
    Boeva, Veselka
    Casalicchio, Emiliano
    FUTURE INTERNET, 2022, 14 (12)
  • [37] EFFICIENT CLIENT CONTRIBUTION EVALUATION FOR HORIZONTAL FEDERATED LEARNING
    Zhao, Jie
    Zhu, Xinghua
    Wang, Jianzong
    Xiao, Jing
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 3060 - 3064
  • [38] RFCSC: Communication efficient reinforcement federated learning with dynamic client selection and adaptive gradient compression
    Pan, Zhenhui
    Li, Yawen
    Guan, Zeli
    Liang, Meiyu
    Li, Ang
    Wang, Jia
    Kou, Feifei
    NEUROCOMPUTING, 2025, 612
  • [39] Communication-efficient federated learning
    Chen, Mingzhe
    Shlezinger, Nir
    Poor, H. Vincent
    Eldar, Yonina C.
    Cui, Shuguang
    PROCEEDINGS OF THE NATIONAL ACADEMY OF SCIENCES OF THE UNITED STATES OF AMERICA, 2021, 118 (17)
  • [40] DynamicFL: Balancing Communication Dynamics and Client Manipulation for Federated Learning
    Chen, Bocheng
    Ivanov, Nikolay
    Wang, Guangjing
    Yan, Qiben
2023 20TH ANNUAL IEEE INTERNATIONAL CONFERENCE ON SENSING, COMMUNICATION, AND NETWORKING, SECON, 2023