Reuse of Client Models in Federated Learning

Cited by: 0
Authors
Cao, Bokai [1 ]
Wu, Weigang [1 ]
Zhan, Congcong [1 ]
Zhou, Jieying [1 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou, Peoples R China
Keywords
Federated Learning; Mobile Edge Computing; Edge Learning; Deep Learning; Distributed Learning
DOI
10.1109/SMARTCOMP55677.2022.00080
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated Learning (FL) has attracted considerable attention from both academia and industry. In FL, user data no longer needs to be transmitted to a data center; instead, each client device trains the deep model on its own data, protecting user privacy. Two kinds of FL architecture exist: cloud-based and edge-based. Researchers have proposed the client-edge-cloud hierarchical FL system, which combines the advantages of both while avoiding their drawbacks. To improve the utilization of local model parameters and data, thereby enhancing the accuracy of each individual edge model and ultimately producing a better global model, we propose the Client Model Multiple Access (CMMA) algorithm. CMMA allows each client to associate with a set of edge servers and to upload its training results to all servers in that set; that is, the local model of one client is reused by multiple edge servers. Such reuse can improve model performance, particularly when the number of clients is small. Empirical experiments demonstrate the superiority of our scheme across different datasets, CNN models, and data distributions. The results not only show that CMMA generally outperforms hierarchical FL, but also validate CMMA's advantage in guaranteeing global model accuracy under unstable networks or a shortage of clients.
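The core idea described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function names (`fedavg`, `cmma_round`), the flattened-parameter model representation, and the uniform cloud-level weighting are all illustrative assumptions. It only shows the structural point that, unlike plain hierarchical FL, each client model is aggregated by every edge server in its association set.

```python
# Sketch (illustrative, not the paper's code) of client-model reuse in a
# hierarchical client-edge-cloud round: each client's model is averaged
# into EVERY edge server it is associated with, then the cloud averages
# the edge models.
from typing import List

import numpy as np

def fedavg(models: List[np.ndarray], weights: List[float]) -> np.ndarray:
    """Weighted average of flattened model parameter vectors."""
    total = sum(weights)
    return sum(w * m for w, m in zip(weights, models)) / total

def cmma_round(client_models, client_sizes, associations, num_edges):
    """One aggregation round.

    client_models: dict client_id -> flattened parameter vector
    client_sizes:  dict client_id -> local dataset size (aggregation weight)
    associations:  dict client_id -> set of edge server ids (may overlap!)
    """
    edge_models = []
    for e in range(num_edges):
        # A client contributes to edge e whenever e is in its association
        # set, so one local model can be reused by several edge servers.
        members = [c for c, edges in associations.items() if e in edges]
        edge_models.append(
            fedavg([client_models[c] for c in members],
                   [client_sizes[c] for c in members])
        )
    # Cloud aggregation over edge models (uniform weights for simplicity).
    return fedavg(edge_models, [1.0] * num_edges)
```

With overlapping association sets, the edge models are correlated through the shared clients, which is what lets CMMA stabilize edge-level accuracy when few clients are available.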
Pages: 356 - 361
Page count: 6