REFL: Resource-Efficient Federated Learning

Cited by: 11
Authors
Abdelmoniem, Ahmed M. [1 ,3 ]
Sahu, Atal Narayan [2 ]
Canini, Marco [2 ]
Fahmy, Suhaib A. [2 ]
Affiliations
[1] Queen Mary Univ London, London, England
[2] KAUST, Thuwal, Saudi Arabia
[3] Assiut Univ, Assiut, Egypt
Keywords
DOI
10.1145/3552326.3567485
Chinese Library Classification (CLC)
TP3 [Computing technology, computer technology];
Discipline code
0812;
Abstract
Federated Learning (FL) enables distributed training by learners using local data, thereby enhancing privacy and reducing communication. However, as deployments scale, it faces numerous challenges relating to heterogeneity in data distributions, device capabilities, and participant availability, which can affect both model convergence and bias. Existing FL schemes use random participant selection to improve the fairness of the selection process; however, this can result in inefficient use of resources and lower-quality training. In this work, we systematically address the question of resource efficiency in FL, showing the benefits of intelligent participant selection and of incorporating updates from straggling participants. We demonstrate how these factors enable resource efficiency while also improving trained model quality.
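The abstract names two techniques, intelligent participant selection and incorporation of updates from straggling participants, without reproducing their details in this record. The following Python sketch only illustrates those two ideas under stated assumptions and is not the paper's actual algorithm: select_participants, aggregate, the least-recently-selected priority rule, and the STALENESS_DECAY discount are hypothetical names and choices introduced for this example.

import numpy as np

STALENESS_DECAY = 0.5  # hypothetical discount applied per round of staleness

def select_participants(last_round_used, available, k):
    # Prefer available clients that have gone longest without being selected.
    ranked = sorted(available, key=lambda c: last_round_used[c])
    return ranked[:k]

def aggregate(global_model, fresh_updates, stale_updates, staleness):
    # Average on-time updates together with staleness-discounted late updates.
    updates = list(fresh_updates)
    weights = [1.0] * len(fresh_updates)
    for cid, upd in stale_updates.items():
        updates.append(upd)
        weights.append(STALENESS_DECAY ** staleness[cid])
    if not updates:  # nothing arrived this round
        return global_model
    w = np.asarray(weights) / np.sum(weights)
    delta = sum(wi * ui for wi, ui in zip(w, updates))
    return global_model + delta

# Toy usage: 20 simulated clients, 5 selected per round, 3 rounds.
rng = np.random.default_rng(0)
dim, n_clients, per_round = 10, 20, 5
model = np.zeros(dim)
last_used = {c: -1 for c in range(n_clients)}
late, staleness = {}, {}  # straggler updates carried over from the previous round

for rnd in range(3):
    available = [c for c in range(n_clients) if rng.random() < 0.6]
    chosen = select_participants(last_used, available, per_round)
    fresh, arriving_late = [], {}
    for c in chosen:
        upd = rng.normal(scale=0.01, size=dim)  # stand-in for a local training delta
        if rng.random() < 0.3:                  # simulate a straggler finishing late
            arriving_late[c] = upd              # reaches the server only next round
        else:
            fresh.append(upd)
        last_used[c] = rnd
    # Combine this round's on-time updates with stragglers from the previous round.
    model = aggregate(model, fresh, late, staleness)
    late = arriving_late
    staleness = {c: 1 for c in late}

print("final model norm:", float(np.linalg.norm(model)))

In this sketch, the staleness discount keeps late updates from dominating the aggregate while still making use of the work straggling devices performed, which is the resource-efficiency benefit the abstract points to.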
Pages: 215 - 232
Page count: 18
Related papers
50 in total
  • [1] Resource-Efficient Federated Learning for Network Intrusion Detection
    Doriguzzi-Corin, Roberto
    Cretti, Silvio
    Siracusa, Domenico
    [J]. 2024 IEEE 10TH INTERNATIONAL CONFERENCE ON NETWORK SOFTWARIZATION, NETSOFT 2024, 2024, : 357 - 362
  • [2] RESOURCE-EFFICIENT FEDERATED LEARNING ROBUST TO COMMUNICATION ERRORS
    Lari, Ehsan
    Gogineni, Vinay Chakravarthi
    Arablouei, Reza
    Werner, Stefan
    [J]. 2023 IEEE STATISTICAL SIGNAL PROCESSING WORKSHOP, SSP, 2023, : 265 - 269
  • [3] Toward Resource-Efficient Federated Learning in Mobile Edge Computing
    Yu, Rong
    Li, Peichun
    [J]. IEEE NETWORK, 2021, 35 (01) : 148 - 155
  • [4] FedTrip: A Resource-Efficient Federated Learning Method with Triplet Regularization
    Li, Xujing
    Liu, Min
    Sun, Sheng
    Wang, Yuwei
    Jiang, Hui
    Jiang, Xuefeng
    [J]. 2023 IEEE INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM, IPDPS, 2023, : 809 - 819
  • [5] Resource-Efficient Federated Learning with Hierarchical Aggregation in Edge Computing
    Wang, Zhiyuan
    Xu, Hongli
    Liu, Jianchun
    Huang, He
    Qiao, Chunming
    Zhao, Yangming
    [J]. IEEE CONFERENCE ON COMPUTER COMMUNICATIONS (IEEE INFOCOM 2021), 2021,
  • [6] MAS: Towards Resource-Efficient Federated Multiple-Task Learning
    Zhuang, Weiming
    Wen, Yonggang
    Lyu, Lingjuan
    Zhang, Shuai
    [J]. 2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 23357 - 23367
  • [7] FLrce: Resource-Efficient Federated Learning With Early-Stopping Strategy
    Niu, Ziru
    Dong, Hai
    Qin, A.K.
    Gu, Tao
    [J]. IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (12) : 14514 - 14529
  • [8] Resource-Efficient Hierarchical Collaborative Federated Learning in Heterogeneous Internet of Things
    Wang, Ruyan
    Chen, Wei
    Zhang, Puning
    Wu, Dapeng
    Yang, Zhigang
    [J]. JOURNAL OF ELECTRONICS & INFORMATION TECHNOLOGY, 2023, 45 (08) : 2847 - 2855
  • [9] Resource-efficient federated learning over IoAT for rice leaf disease classification
    Aggarwal, Meenakshi
    Khullar, Vikas
    Goyal, Nitin
    Prola, Thomas Andre
    [J]. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2024, 221
  • [10] Towards a resource-efficient semi-asynchronous federated learning for heterogeneous devices
    Sasindran, Zitha
    Yelchuri, Harsha
    Prabhakar, T. V.
    [J]. 2024 NATIONAL CONFERENCE ON COMMUNICATIONS, NCC, 2024,