iSample: Intelligent Client Sampling in Federated Learning

Cited by: 7
Authors
Imani, HamidReza [1 ]
Anderson, Jeff [1 ]
El-Ghazawi, Tarek [1 ]
Affiliations
[1] George Washington Univ, Dept Elect & Comp Engn, Washington, DC 20052 USA
Funding
US National Science Foundation;
Keywords
federated learning; heterogeneous systems; resource constrained devices; edge computing; machine learning;
DOI
10.1109/ICFEC54809.2022.00015
CLC (Chinese Library Classification)
TP3 [Computing Technology, Computer Technology];
Discipline Code
0812 ;
Abstract
The pervasiveness of AI in society has made machine learning (ML) an invaluable tool for mobile and internet-of-things (IoT) devices. While the aggregate amount of data yielded by those devices is sufficient for training an accurate model, the data available to any one device is limited. Therefore, augmenting the learning at any device with the experience drawn from observations at the rest of the devices becomes necessary. This, however, can dramatically increase bandwidth requirements. Prior work has led to the development of Federated Learning (FL), where instead of exchanging data, client devices share only model weights to learn from one another. However, heterogeneity in device resource availability and network conditions still imposes limitations on training performance. To improve performance while maintaining good levels of accuracy, we introduce iSample, an intelligent sampling technique that selects clients by jointly considering known network performance and model-quality parameters, allowing training time to be minimized. We compare iSample with other federated learning approaches and show that iSample improves the performance of the global model, especially in the earlier stages of training, while decreasing the training time for CNN and VGG by 27% and 39%, respectively.
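The abstract describes selection that jointly weighs network performance and model quality. The exact scoring function is not given there, so the following is only an illustrative sketch: a weighted combination of (hypothetical) per-client bandwidth and local-loss statistics, with the top-k clients chosen each round. All field names and the parameter `alpha` are assumptions for illustration, not the paper's actual formulation.

```python
def isample_select(clients, k, alpha=0.5):
    """Pick the top-k clients by a joint score of network speed and
    model quality. The scoring below is an illustrative stand-in, not
    the scoring actually used by iSample."""
    max_bw = max(c["bandwidth_mbps"] for c in clients)
    max_loss = max(c["local_loss"] for c in clients)

    def score(c):
        # Higher bandwidth -> faster communication round.
        net = c["bandwidth_mbps"] / max_bw
        # Lower local loss -> better-quality local model.
        quality = 1.0 - c["local_loss"] / max_loss
        return alpha * net + (1 - alpha) * quality

    return sorted(clients, key=score, reverse=True)[:k]

# Hypothetical client pool: fast-but-poor, slow-but-good, and balanced.
pool = [
    {"id": 0, "bandwidth_mbps": 50.0, "local_loss": 0.9},
    {"id": 1, "bandwidth_mbps": 5.0,  "local_loss": 0.2},
    {"id": 2, "bandwidth_mbps": 30.0, "local_loss": 0.4},
]
selected = isample_select(pool, k=2)  # balanced client ranks first
```

With equal weighting (`alpha=0.5`), the balanced client outranks both the fast-but-inaccurate and the slow-but-accurate one, which mirrors the abstract's point that neither network conditions nor model quality alone should drive selection.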
Pages: 58-65 (8 pages)