Reuse of Client Models in Federated Learning

Cited by: 0
Authors
Cao, Bokai [1 ]
Wu, Weigang [1 ]
Zhan, Congcong [1 ]
Zhou, Jieying [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou, Peoples R China
Keywords
Federated Learning; Mobile Edge Computing; Edge Learning; Deep Learning; Distributed Learning
DOI
10.1109/SMARTCOMP55677.2022.00080
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Federated Learning (FL) has attracted considerable attention from both academia and industry. In FL, user data no longer need to be transmitted to a data center; instead, each client device trains the deep model on its personal data, protecting user privacy. Two kinds of FL architecture exist: cloud-based and edge-based. Researchers have proposed the client-edge-cloud hierarchical FL system, which combines the advantages of both while avoiding their drawbacks. To improve the utilization of local model parameters and data, thereby enhancing the accuracy of each edge model and ultimately achieving a better global model, we propose the Client Model Multiple Access (CMMA) algorithm. CMMA allows each client to associate with a set of edge servers and upload its training results to all servers in that set; that is, the local model of one client is reused by multiple edge servers. Such reuse improves model performance, particularly when the number of clients is small. Empirical experiments demonstrate the superiority of our scheme across different datasets, CNN models, and data distributions. The results show not only the general superiority of CMMA over hierarchical FL, but also validate the advantage of CMMA in guaranteeing global model accuracy under unstable networks or a shortage of clients.
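The reuse mechanism described in the abstract can be illustrated with a minimal sketch: each client's model is delivered to every edge server in its associated set, each edge server averages the models it receives, and the cloud averages the edge models. This is an assumption-laden illustration, not the paper's implementation: FedAvg-style unweighted averaging is assumed, models are reduced to flat parameter vectors, and all names (`cmma_round`, `average`, the association map `assoc`) are hypothetical.

```python
import numpy as np

def average(models):
    """FedAvg-style unweighted average of model parameter vectors (assumed)."""
    return np.mean(models, axis=0)

def cmma_round(client_models, client_edge_sets, num_edges):
    """One aggregation round in which each client's local model is reused
    by every edge server in its associated set -- the core idea of CMMA."""
    received = {e: [] for e in range(num_edges)}
    for cid, edges in client_edge_sets.items():
        for e in edges:  # reuse: the same client model goes to several edges
            received[e].append(client_models[cid])
    # each edge aggregates the client models it received
    edge_models = [average(received[e]) for e in range(num_edges) if received[e]]
    # the cloud aggregates the edge models into the global model
    return average(edge_models)

# Toy example: 3 clients, 2 edge servers; client 1 associates with both edges,
# so its model is reused by edge 0 and edge 1.
client_models = {0: np.array([1.0, 1.0]),
                 1: np.array([2.0, 2.0]),
                 2: np.array([3.0, 3.0])}
assoc = {0: [0], 1: [0, 1], 2: [1]}
global_model = cmma_round(client_models, assoc, num_edges=2)
# edge 0 averages clients {0, 1}, edge 1 averages clients {1, 2},
# and the cloud averages the two edge models -> [2.0, 2.0]
```

With a single-edge association map (plain hierarchical FL), client 1's update would reach only one edge; the multi-edge set is what lets a small client population still populate every edge aggregate.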
Pages: 356-361
Number of pages: 6