FedAdaSS: Federated Learning with Adaptive Parameter Server Selection Based on Elastic Cloud Resources

Cited by: 1
Authors
Xu, Yuwei [1 ]
Zhao, Baokang [1 ]
Zhou, Huan [1 ]
Su, Jinshu [1 ]
Affiliation
[1] Natl Univ Def Technol, Sch Comp, Changsha 410000, Peoples R China
Source
Computer Modeling in Engineering & Sciences (CMES)
Funding
National Natural Science Foundation of China
Keywords
Machine learning systems; federated learning; server selection; artificial intelligence of things; non-IID data;
DOI
10.32604/cmes.2024.053462
CLC Number
T [Industrial Technology]
Discipline Code
08
Abstract
The rapid expansion of artificial intelligence (AI) applications has raised significant concerns about user privacy, prompting the development of privacy-preserving machine learning (ML) paradigms such as federated learning (FL). FL enables the distributed training of ML models, keeping data on local devices and thus addressing the privacy concerns of users. However, challenges arise from the heterogeneous nature of mobile client devices, partial engagement in training, and non-independent and identically distributed (non-IID) data, leading to performance degradation and optimization objective bias in FL training. With the development of 5G/6G networks and the integration of cloud and edge computing resources, globally distributed cloud computing resources can be effectively utilized to optimize the FL process. Through a parameter server selection mechanism, the FL organizer can reduce network latency overhead without increasing monetary cost, while also balancing the objectives of communication optimization and low-engagement mitigation, which cannot be achieved simultaneously in the single-server frameworks of existing works. In this paper, we propose the FedAdaSS algorithm, an adaptive parameter server selection mechanism designed to optimize training efficiency by selecting the most appropriate server as the parameter server in each round of FL training. Our approach leverages the flexibility of cloud computing resources and allows organizers to strategically select servers for model broadcasting and aggregation, thus improving training performance while maintaining cost efficiency. FedAdaSS estimates the utility of client systems and servers and incorporates an adaptive random reshuffling strategy that selects the optimal server in each round of the training process. Theoretical analysis confirms the convergence of FedAdaSS under strong convexity and L-smoothness assumptions, and comparative experiments within the FLSim framework demonstrate a 12%-20% reduction in rounds-to-accuracy compared to Federated Averaging (FedAvg) with random reshuffling under a single server. Furthermore, FedAdaSS effectively mitigates the performance loss caused by low client engagement, reducing the loss indicator by 50%.
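The abstract describes FedAdaSS as estimating the utility of client systems and candidate servers and, in each round, selecting the highest-utility server as the parameter server while reshuffling the participating clients. The sketch below illustrates that per-round selection loop in Python as a minimal sketch only; the utility function, the latency/compute fields, and the function names are illustrative assumptions, not the paper's actual formulation.

```python
import random

def estimate_server_utility(server, clients):
    """Hypothetical utility: favor servers with low average latency to the
    sampled clients and high available compute (both assumed measurable)."""
    avg_latency = sum(server["latency"][c["id"]] for c in clients) / len(clients)
    return server["compute"] / (1.0 + avg_latency)

def fedadass_round_sketch(servers, clients, clients_per_round, rng=random):
    """One FL round: reshuffle/sample a client cohort, then pick the
    parameter server with the highest estimated utility for this cohort."""
    cohort = rng.sample(clients, clients_per_round)  # random reshuffling of participants
    best_server = max(servers, key=lambda s: estimate_server_utility(s, cohort))
    return best_server, cohort

# Toy usage with synthetic latency/compute values.
clients = [{"id": i} for i in range(10)]
servers = [
    {"name": "eu-west", "compute": 8.0, "latency": {i: random.uniform(20, 80) for i in range(10)}},
    {"name": "us-east", "compute": 6.0, "latency": {i: random.uniform(10, 60) for i in range(10)}},
]
server, cohort = fedadass_round_sketch(servers, clients, clients_per_round=4)
print(server["name"], [c["id"] for c in cohort])
```

In the paper's setting, the selected server would then broadcast the current model to the cohort and aggregate their updates for that round; the utility estimate above merely stands in for whatever client/server utility model FedAdaSS actually uses.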
Pages: 609-629
Number of pages: 21
Related Papers
50 records in total
  • [41] Clustered federated learning enhanced by DAG-based blockchain with adaptive tip selection algorithm
    Xue, Xiaofeng
    Mao, Haokun
    Li, Qiong
    Guan, Xin
    INTERNET OF THINGS, 2025, 31
  • [42] EdgeFedNet: Edge Server Based Communication and Computation Efficient Federated Learning
    Gowtham, L.
    Annappa, B.
    Sachin, D. N.
    SN COMPUTER SCIENCE, 6 (3)
  • [43] Optimized and Adaptive Federated Learning for Straggler-Resilient Device Selection
    Banerjee, Sourasekhar
    Vu, Xuan-Son
    Bhuyan, Monowar
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [44] Optimal Model Transfer and Dynamic Parameter Server Selection for Efficient Federated Learning in IoT-Edge Systems With Non-IID Data
    Mengistu, Tesfahunegn Minwuyelet
    Lin, Jenn-Wei
    Kuo, Po-Hsien
    Kim, Taewoon
    IEEE ACCESS, 2024, 12 : 157954 - 157974
  • [45] Joint Data Allocation and LSTM-Based Server Selection With Parallelized Federated Learning in LEO Satellite IoT Networks
    Qin, Pengxiang
    Xu, Dongyang
    Liu, Lei
    Dong, Mianxiong
    Mumtaz, Shahid
    Guizani, Mohsen
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (06): : 6259 - 6271
  • [46] Elastic Management of Cloud Applications using Adaptive Reinforcement Learning
    Lolos, Konstantinos
    Konstantinou, Ioannis
    Kantere, Verena
    Koziris, Nectarios
    2017 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2017, : 203 - 212
  • [47] PAS: Towards Accurate and Efficient Federated Learning with Parameter-Adaptive Synchronization
    Gan, Zuo
    Chen, Chen
    Zhang, Jiayi
    Zeng, Gaoxiong
    Zhu, Yifei
    Zhao, Jieru
    Chen, Quan
    Guo, Minyi
    2024 IEEE/ACM 32ND INTERNATIONAL SYMPOSIUM ON QUALITY OF SERVICE, IWQOS, 2024,
  • [48] Embedding based learning for collection selection in federated search
    Garba, Adamu
    Khalid, Shah
    Ullah, Irfan
    Khusro, Shah
    Mumin, Diyawu
    DATA TECHNOLOGIES AND APPLICATIONS, 2020, 54 (05) : 703 - 717
  • [49] Contribution-based Federated Learning client selection
    Lin, Weiwei
    Xu, Yinhai
    Liu, Bo
    Li, Dongdong
    Huang, Tiansheng
    Shi, Fang
    INTERNATIONAL JOURNAL OF INTELLIGENT SYSTEMS, 2022, 37 (10) : 7235 - 7260
  • [50] Blended Elastic Scaling Method for Cloud Resources Following Reinforcement Learning
    Wu, X.
    Zhang, C.
    Yuan, S.
    Ren, X.
    Wang, W.
    Hsi-An Chiao Tung Ta Hsueh/Journal of Xi'an Jiaotong University, 2022, 56 (01): : 142 - 150