Adaptive clustering federated learning via similarity acceleration

Cited: 0
Authors
Zhu S. [1,2]
Gu B. [1,2]
Sun G. [1,2]
Affiliations
[1] School of Computer Science and Technology, Harbin University of Science and Technology, Harbin
[2] Heilongjiang Key Laboratory of Intelligent Information Processing and Application, Harbin
Keywords
clustering; federated learning; geometric characteristic; personalization; positive feedback
DOI
10.11959/j.issn.1000-436x.2024069
Abstract
To address the model performance degradation caused by data heterogeneity in federated learning, personalization needs to be considered in the federated model. A new adaptive clustering federated learning (ACFL) algorithm via similarity acceleration was proposed, which achieved adaptively accelerated clustering based on the geometric properties of local updates and a positive feedback mechanism during the clients' federated training. By dividing clients into different task clusters, clients with similar data distributions in the same cluster cooperated to improve the performance of the federated model. The algorithm neither needed the number of clusters to be determined in advance nor required iterative re-partitioning of the clients, thereby avoiding the high computational cost and slow convergence of existing clustered federated learning methods while preserving model performance. The effectiveness of ACFL was verified with deep convolutional neural networks on commonly used datasets. The results show that ACFL performs comparably to the clustered federated learning (CFL) algorithm, outperforms the traditional iterative federated clustering algorithm (IFCA), and converges faster. © 2024 Editorial Board of Journal on Communications. All rights reserved.
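The abstract describes the clustering mechanism only at a high level. As a hedged illustration of the general idea, the minimal Python sketch below groups clients by the cosine similarity of their flattened local-update vectors and opens a new cluster whenever no existing cluster is similar enough, so the number of clusters need not be fixed in advance. The function name assign_to_cluster, the threshold value, and the moving-average centroid update are assumptions introduced here for illustration; this is not the paper's actual ACFL procedure.

```python
import numpy as np

def cosine_sim(u, v):
    """Cosine similarity between two flattened update vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def assign_to_cluster(update, centroids, threshold=0.5):
    """Assign a client's local update to the most similar cluster centroid.

    If no centroid is similar enough, open a new cluster, so the number of
    clusters is not fixed in advance (a hypothetical reading of the
    'adaptive' step, not the paper's exact rule).
    """
    if centroids:
        sims = [cosine_sim(update, c) for c in centroids]
        best = int(np.argmax(sims))
        if sims[best] >= threshold:
            # Positive-feedback-style update: pull the matched centroid
            # toward the newly assigned client's update direction.
            centroids[best] = 0.9 * centroids[best] + 0.1 * update
            return best
    centroids.append(update.copy())
    return len(centroids) - 1

# Usage: cluster three synthetic client updates.
rng = np.random.default_rng(0)
centroids = []
for _ in range(3):
    client_update = rng.normal(size=10)
    cid = assign_to_cluster(client_update, centroids)
    print("assigned to cluster", cid)
```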
Pages: 197-207 (10 pages)