iSample: Intelligent Client Sampling in Federated Learning

Cited by: 11
Authors:
Imani, HamidReza [1 ]
Anderson, Jeff [1 ]
El-Ghazawi, Tarek [1 ]
Affiliations:
[1] George Washington Univ, Dept Elect & Comp Engn, Washington, DC 20052 USA
Funding:
National Science Foundation (NSF)
Keywords:
federated learning; heterogeneous systems; resource constrained devices; edge computing; machine learning;
DOI:
10.1109/ICFEC54809.2022.00015
Chinese Library Classification (CLC): TP3 [Computing Technology, Computer Technology]
Discipline Code: 0812
Abstract:
The pervasiveness of AI in society has made machine learning (ML) an invaluable tool for mobile and Internet-of-Things (IoT) devices. While the aggregate amount of data produced by these devices is sufficient for training an accurate model, the data available to any single device is limited. Augmenting the learning at each device with the experience drawn from observations at the other devices is therefore necessary, but doing so can dramatically increase bandwidth requirements. Prior work has led to the development of Federated Learning (FL), in which client devices share only model weights, rather than raw data, to learn from one another. However, heterogeneity in device resource availability and network conditions still limits training performance. To improve performance while maintaining good accuracy, we introduce iSample, an intelligent sampling technique that selects clients by jointly considering known network performance and model quality parameters, thereby minimizing training time. We compare iSample with other federated learning approaches and show that it improves the performance of the global model, especially in the early stages of training, while decreasing training time for CNN and VGG models by 27% and 39%, respectively.
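The joint selection idea described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the client attributes (`est_round_time`, `local_loss`), the normalization, and the trade-off weight `alpha` are all illustrative assumptions standing in for the "network performance" and "model quality" parameters the abstract mentions.

```python
def sample_clients(clients, k, alpha=0.5):
    """Select k clients by jointly scoring network speed and model quality.

    Each client is a dict with hypothetical keys:
      'est_round_time' - predicted seconds to train and upload (lower is better)
      'local_loss'     - last reported local loss (higher suggests more to learn)
    alpha trades off speed (alpha=1) against expected model improvement (alpha=0).
    """
    max_t = max(c["est_round_time"] for c in clients)
    max_l = max(c["local_loss"] for c in clients)

    def score(c):
        speed = 1.0 - c["est_round_time"] / max_t   # fast clients score high
        quality = c["local_loss"] / max_l           # high-loss clients score high
        return alpha * speed + (1 - alpha) * quality

    # Keep the k highest-scoring clients for this training round.
    return sorted(clients, key=score, reverse=True)[:k]
```

In this sketch, a client that is both fast and holds data the global model fits poorly ranks highest, which mirrors the abstract's claim that considering both factors together helps early-round accuracy without stretching round time.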
Pages: 58-65 (8 pages)
Related Papers (50 in total):
  • [31] Client Selection with Bandwidth Allocation in Federated Learning
    Kuang, Junqian
    Yang, Miao
    Zhu, Hongbin
    Qian, Hua
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [32] Online Client Scheduling for Fast Federated Learning
    Xu, Bo
    Xia, Wenchao
    Zhang, Jun
    Quek, Tony Q. S.
    Zhu, Hongbo
    IEEE WIRELESS COMMUNICATIONS LETTERS, 2021, 10 (07) : 1434 - 1438
  • [33] Towards Client Selection in Satellite Federated Learning
    Wu, Changhao
    He, Siyang
    Yin, Zengshan
    Guo, Chongbin
    APPLIED SCIENCES-BASEL, 2024, 14 (03):
  • [34] FedCA: Efficient Federated Learning with Client Autonomy
    Lyu, Na
    Jiang, Zhifeng
    Shen, Zhi
    Zhang, Jiayi
    Guo, Minyi
    Chen, Chen
    Chen, Quan
    53RD INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, ICPP 2024, 2024, : 494 - 503
  • [35] A review on client selection models in federated learning
    Panigrahi, Monalisa
    Bharti, Sourabh
    Sharma, Arun
    WILEY INTERDISCIPLINARY REVIEWS-DATA MINING AND KNOWLEDGE DISCOVERY, 2023, 13 (06)
  • [36] An Efficient Client Selection for Wireless Federated Learning
    Chen, Jingyi
    Wang, Qiang
    Zhang, Wenqi
    2023 28TH ASIA PACIFIC CONFERENCE ON COMMUNICATIONS, APCC 2023, 2023, : 291 - 296
  • [37] Active Client Selection for Clustered Federated Learning
    Huang, Honglan
    Shi, Wei
    Feng, Yanghe
    Niu, Chaoyue
    Cheng, Guangquan
    Huang, Jincai
    Liu, Zhong
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (11) : 16424 - 16438
  • [38] Rethinking Personalized Client Collaboration in Federated Learning
    Wu, Leijie
    Guo, Song
    Ding, Yaohong
    Wang, Junxiao
    Xu, Wenchao
    Zhan, Yufeng
    Kermarrec, Anne-Marie
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (12) : 11227 - 11239
  • [40] Dynamic Pricing for Client Recruitment in Federated Learning
    Wang, Xuehe
    Zheng, Shensheng
    Duan, Lingjie
    IEEE-ACM TRANSACTIONS ON NETWORKING, 2024, 32 (02) : 1273 - 1286