FedAdaSS: Federated Learning with Adaptive Parameter Server Selection Based on Elastic Cloud Resources

Cited by: 1
Authors:
Xu, Yuwei [1 ]
Zhao, Baokang [1 ]
Zhou, Huan [1 ]
Su, Jinshu [1 ]
Affiliations:
[1] Natl Univ Def Technol, Sch Comp, Changsha 410000, Peoples R China
Funding: National Natural Science Foundation of China
Keywords: Machine learning systems; federated learning; server selection; artificial intelligence of things; non-IID data
DOI: 10.32604/cmes.2024.053462
CLC classification: T [Industrial Technology]
Subject classification: 08
Abstract:
The rapid expansion of artificial intelligence (AI) applications has raised significant concerns about user privacy, prompting the development of privacy-preserving machine learning (ML) paradigms such as federated learning (FL). FL enables the distributed training of ML models, keeping data on local devices and thus addressing the privacy concerns of users. However, challenges arise from the heterogeneous nature of mobile client devices, partial engagement in training, and non-independent identically distributed (non-IID) data, leading to performance degradation and optimization objective bias in FL training. With the development of 5G/6G networks and the integration of cloud and edge computing resources, globally distributed cloud computing resources can be effectively utilized to optimize the FL process. By selecting the parameter server for each round through a dedicated selection mechanism, the approach reduces network latency overhead without increasing monetary cost, and balances the objectives of communication optimization and low-engagement mitigation, which cannot be achieved simultaneously in the single-server frameworks of existing works. In this paper, we propose the FedAdaSS algorithm, an adaptive parameter server selection mechanism designed to optimize the training efficiency of each round of FL training by selecting the most appropriate server as the parameter server. Our approach leverages the flexibility of cloud computing resources and allows organizers to strategically select servers for data broadcasting and aggregation, thus improving training performance while maintaining cost efficiency. The FedAdaSS algorithm estimates the utility of client systems and servers and incorporates an adaptive random reshuffling strategy that selects the optimal server in each round of the training process.
Theoretical analysis confirms the convergence of FedAdaSS under strong convexity and L-smoothness assumptions, and comparative experiments within the FLSim framework demonstrate a 12%-20% reduction in rounds-to-accuracy compared to Federated Averaging (FedAvg) with random reshuffling under a single server. Furthermore, FedAdaSS effectively mitigates performance loss caused by low client engagement, reducing the loss indicator by 50%.
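The abstract describes a per-round loop: estimate a utility for each candidate cloud server, pick the best one as the parameter server, then run standard FedAvg aggregation. A minimal sketch of that idea follows; the function names, the utility form (engagement divided by a latency penalty), and the plain-list FedAvg helper are illustrative assumptions, not the authors' exact formulation.

```python
# Hedged sketch of adaptive parameter server selection for one FL round.
# The utility definition below is an assumption for illustration only.

def server_utility(latency_ms, engaged_clients, total_clients):
    """Toy utility: favor low network latency and high expected client engagement."""
    engagement = engaged_clients / total_clients
    return engagement / (1.0 + latency_ms / 100.0)

def select_parameter_server(servers):
    """Pick the candidate server with the highest estimated utility.

    servers: list of dicts with keys 'name', 'latency_ms', 'engaged', 'total'.
    """
    return max(
        servers,
        key=lambda s: server_utility(s["latency_ms"], s["engaged"], s["total"]),
    )

def fedavg(client_updates, client_sizes):
    """Plain FedAvg: average client model vectors weighted by local dataset size."""
    total = sum(client_sizes)
    dim = len(client_updates[0])
    return [
        sum(w[i] * n for w, n in zip(client_updates, client_sizes)) / total
        for i in range(dim)
    ]
```

In a single-server setup the selection step disappears and every round pays the fixed latency of that one server; the selection step is what lets the organizer trade latency against expected engagement each round.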
Pages: 609-629 (21 pages)
Related papers (50 total):
  • [1] A Traffic Model Based Approach to Parameter Server Design in Federated Learning Processes
    Tedeschini, Bernardo Camajori
    Savazzi, Stefano
    Nicoli, Monica
    IEEE COMMUNICATIONS LETTERS, 2023, 27 (07) : 1774 - 1778
  • [2] Enhancing Federated Learning With Server-Side Unlabeled Data by Adaptive Client and Data Selection
    Xu, Yang
    Wang, Lun
    Xu, Hongli
    Liu, Jianchun
    Wang, Zhiyuan
    Huang, Liusheng
    IEEE TRANSACTIONS ON MOBILE COMPUTING, 2024, 23 (04) : 2813 - 2831
  • [3] Learning-Based Virtual Machine Selection in Cloud Server Consolidation
    Li, Huixi
    Xiao, Yinhao
    Shen, YongLuo
    MATHEMATICAL PROBLEMS IN ENGINEERING, 2022, 2022
  • [4] Adaptive Participant Selection in Heterogeneous Federated Learning
    Albelaihi, Rana
    Sun, Xiang
    Craft, Warren D.
    Yu, Liangkun
    Wang, Chonggang
    2021 IEEE GLOBAL COMMUNICATIONS CONFERENCE (GLOBECOM), 2021,
  • [5] Adaptive Application Master for Elastic Web Server Farms for Cloud based on Hadoop YARN
    Shah, Purshotam
    2013 INTERNATIONAL CONFERENCE ON CLOUD COMPUTING AND BIG DATA (CLOUDCOM-ASIA), 2013, : 461 - 464
  • [6] Pareto-Based Adaptive Resources Selection Model in Hybrid Cloud Environment
    Naik, Ketaki Bhalchandra
    Gandhi, G. Meera
    Patil, S. H.
    IETE JOURNAL OF RESEARCH, 2021, 67 (02) : 240 - 252
  • [7] Adaptive and Efficient Participant Selection in Vertical Federated Learning
    Huang, Jiahui
    Zhang, Lan
    Li, Anran
    Cheng, Haoran
    Xu, Jiexin
    Song, Hongmei
    2023 19TH INTERNATIONAL CONFERENCE ON MOBILITY, SENSING AND NETWORKING, MSN 2023, 2023, : 455 - 462
  • [8] MAB-based Client Selection for Federated Learning with Uncertain Resources in Mobile Networks
    Yoshida, Naoya
    Nishio, Takayuki
    Morikura, Masahiro
    Yamamoto, Koji
    2020 IEEE GLOBECOM WORKSHOPS (GC WKSHPS), 2020,
  • [9] Adaptive device sampling and deadline determination for cloud-based heterogeneous federated learning
    Zhang, Deyu
    Sun, Wang
    Zheng, Zi-Ang
    Chen, Wenxin
    He, Shiwen
    JOURNAL OF CLOUD COMPUTING-ADVANCES SYSTEMS AND APPLICATIONS, 2023, 12 (01)