Bandit-based Communication-Efficient Client Selection Strategies for Federated Learning

Cited by: 42
Authors
Cho, Yae Jee [1 ]
Gupta, Samarth [1 ]
Joshi, Gauri [1 ]
Yagan, Osman [1 ]
Affiliation
[1] Carnegie Mellon Univ, Elect & Comp Engn, Pittsburgh, PA 15213 USA
Funding
U.S. National Science Foundation
Keywords
distributed optimization; federated learning; fairness; client selection; multi-armed bandits;
DOI
10.1109/IEEECONF51394.2020.9443523
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Due to communication constraints and intermittent client availability in federated learning, only a subset of clients can participate in each training round. While most prior works assume uniform and unbiased client selection, recent work on biased client selection [1] has shown that selecting clients with higher local losses can speed up error convergence. However, previously proposed biased selection strategies either incur additional communication cost to evaluate the exact local loss or rely on stale local-loss values, which can even cause the model to diverge. In this paper, we present UCB-CS, a bandit-based communication-efficient client selection strategy that achieves faster convergence with lower communication overhead. We also demonstrate how client selection can be used to improve fairness.
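The selection rule summarized in the abstract, treating each client as a bandit arm and favoring clients with higher estimated local loss while still exploring rarely chosen clients, can be illustrated with a short sketch. This is a minimal illustration in Python, not the paper's exact UCB-CS algorithm: the function select_clients, the exploration weight c, and the simulated loss feedback are assumptions made for the example.

import numpy as np

# Illustrative UCB-style client selection (not the exact UCB-CS algorithm).
# Each client is an arm whose "reward" is its running local-loss estimate,
# so high-loss clients are favored, with a UCB bonus so that rarely
# selected clients are still explored rather than starved.
def select_clients(loss_estimates, selection_counts, round_idx, m, c=1.0):
    counts = np.maximum(selection_counts, 1e-9)           # avoid divide-by-zero
    bonus = c * np.sqrt(np.log(round_idx + 1) / counts)   # exploration term
    scores = loss_estimates + bonus                        # exploit high loss + explore
    return np.argsort(-scores)[:m]                         # indices of the top-m scores

# Toy usage: 10 clients, 3 selected per round. In a real federated loop the
# selected clients would train locally and report fresh losses; here the
# losses are only perturbed to mimic that feedback.
rng = np.random.default_rng(0)
losses = rng.uniform(0.5, 2.0, size=10)
counts = np.zeros(10)
for t in range(1, 6):
    chosen = select_clients(losses, counts, t, m=3)
    counts[chosen] += 1
    losses[chosen] = np.clip(losses[chosen] * 0.95 + rng.normal(0, 0.02, 3), 0.0, None)
    print(f"round {t}: selected clients {sorted(chosen.tolist())}")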
Pages: 1066-1069
Page count: 4
Related Papers
50 in total
  • [21] Communication-Efficient Federated Learning with Heterogeneous Devices
    Chen, Zhixiong
    Yi, Wenqiang
    Liu, Yuanwei
    Nallanathan, Arumugam
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 3602 - 3607
  • [22] Communication-Efficient Federated Learning for Decision Trees
    Zhao, Shuo
    Zhu, Zikun
    Li, Xin
    Chen, Ying-Chi
    IEEE TRANSACTIONS ON ARTIFICIAL INTELLIGENCE, 2024, 5 (11): 5478 - 5492
  • [23] Communication-Efficient Federated Learning with Adaptive Quantization
    Mao, Yuzhu
    Zhao, Zihao
    Yan, Guangfeng
    Liu, Yang
    Lan, Tian
    Song, Linqi
    Ding, Wenbo
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2022, 13 (04)
  • [24] FedBoost: Communication-Efficient Algorithms for Federated Learning
    Hamer, Jenny
    Mohri, Mehryar
    Suresh, Ananda Theertha
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [25] Communication-Efficient Secure Aggregation for Federated Learning
    Ergun, Irem
    Sami, Hasin Us
    Guler, Basak
    2022 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM 2022), 2022, : 3881 - 3886
  • [26] Ternary Compression for Communication-Efficient Federated Learning
    Xu, Jinjin
    Du, Wenli
    Jin, Yaochu
    He, Wangli
    Cheng, Ran
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2022, 33 (03) : 1162 - 1176
  • [27] LESS-VFL: Communication-Efficient Feature Selection for Vertical Federated Learning
    Castiglia, Timothy
    Zhou, Yi
    Wang, Shiqiang
    Kadhe, Swanand
    Baracaldo, Nathalie
    Patterson, Stacy
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 202, 2023, 202
  • [28] Joint Model Pruning and Device Selection for Communication-Efficient Federated Edge Learning
    Liu, Shengli
    Yu, Guanding
    Yin, Rui
    Yuan, Jiantao
    Shen, Lei
    Liu, Chonghe
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2022, 70 (01) : 231 - 244
  • [29] Communication-Efficient Federated Learning With Adaptive Aggregation for Heterogeneous Client-Edge-Cloud Network
    Luo, Long
    Zhang, Chi
    Yu, Hongfang
    Sun, Gang
    Luo, Shouxi
    Dustdar, Schahram
    IEEE TRANSACTIONS ON SERVICES COMPUTING, 2024, 17 (06) : 3241 - 3255
  • [30] Layer-Based Communication-Efficient Federated Learning with Privacy Preservation
    Lian, Zhuotao
    Wang, Weizheng
    Huang, Huakun
    Su, Chunhua
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2022, E105D (02) : 256 - 263