Reducing communication in federated learning via efficient client sampling

Cited by: 10
Authors
Ribero, Monica [1 ]
Vikalo, Haris [1 ]
Affiliations
[1] Univ Texas Austin, Dept Elect & Comp Engn, Austin, TX 78712 USA
Funding
U.S. National Science Foundation;
Keywords
Federated learning; Machine learning; Distributed optimization;
DOI
10.1016/j.patcog.2023.110122
Chinese Library Classification (CLC)
TP18 [Theory of artificial intelligence];
Subject classification codes
081104; 0812; 0835; 1405;
Abstract
Federated learning (FL) ameliorates privacy concerns in settings where a central server coordinates learning from data distributed across many clients; rather than sharing the data, the clients train locally and report the models they learn to the server. Aggregation of local models requires communicating massive amounts of information between the clients and the server, consuming network bandwidth. We propose a novel framework for updating the global model in communication-constrained FL systems by requesting input only from the clients with informative updates and estimating the local updates that are not communicated. Specifically, describing the progression of the model's weights by an Ornstein-Uhlenbeck process allows us to develop a sampling strategy for selecting a subset of clients with significant weight updates; the model updates of the clients not selected for communication are replaced by their estimates. We test this policy on realistic federated benchmark datasets and show that the proposed framework provides up to a 50% reduction in communication while achieving performance competitive with, or superior to, that of the baselines. The proposed method represents a new line of communication-efficient FL strategies that is orthogonal to existing user-driven techniques such as compression, complementing rather than aiming to replace those methods.
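The abstract describes the approach only at a high level. As a rough illustration (not the authors' algorithm), the Python sketch below assumes flattened NumPy weight vectors, a scalar Ornstein-Uhlenbeck mean-reversion rate theta and long-run mean mu, and a top-budget selection rule based on each client's deviation from the server's OU prediction; all of these names, parameters, and the selection rule are illustrative assumptions introduced here for clarity.

import numpy as np

def ou_predict(w_prev, mu, theta, dt=1.0):
    # One-step conditional mean of a standard Ornstein-Uhlenbeck process:
    # E[w_{t+dt} | w_t] = mu + (w_t - mu) * exp(-theta * dt).
    return mu + (w_prev - mu) * np.exp(-theta * dt)

def federated_round(local_models, prev_models, mu, theta, budget):
    # Toy round of a hypothetical protocol:
    # 1. Predict each client's new model from the OU dynamics.
    # 2. Score clients by how far their actual local model strays from the
    #    prediction (in a real deployment each client would compute this
    #    scalar locally and communicate only the score; here everything is
    #    computed in one place for simplicity).
    # 3. Request full updates from the `budget` highest-scoring clients and
    #    substitute OU predictions for the rest.
    n = len(prev_models)
    predicted = [ou_predict(w, mu, theta) for w in prev_models]
    deviations = np.array(
        [np.linalg.norm(local_models[i] - predicted[i]) for i in range(n)]
    )
    selected = set(np.argsort(deviations)[-budget:])  # most informative clients
    # Aggregate communicated models where available, OU estimates otherwise.
    used = [local_models[i] if i in selected else predicted[i] for i in range(n)]
    return np.mean(used, axis=0), selected

# Example: 5 clients with 3-dimensional models, request updates from 2 of them.
rng = np.random.default_rng(0)
prev = [rng.normal(size=3) for _ in range(5)]
local = [w + rng.normal(scale=0.1, size=3) for w in prev]
new_global, chosen = federated_round(local, prev, mu=0.0, theta=0.5, budget=2)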
Pages: 12