Adaptive clustering federated learning via similarity acceleration

Authors
Zhu S. [1 ,2 ]
Gu B. [1 ,2 ]
Sun G. [1 ,2 ]
Affiliations
[1] School of Computer Science and Technology, Harbin University of Science and Technology, Harbin
[2] Heilongjiang Key Laboratory of Intelligent Information Processing and Application, Harbin
Keywords
clustering; federated learning; geometric characteristic; personalization; positive feedback
DOI
10.11959/j.issn.1000-436x.2024069
Abstract
To address the model performance degradation caused by data heterogeneity in federated learning, personalization needs to be introduced into the federated model. A new adaptive clustering federated learning (ACFL) algorithm via similarity acceleration was proposed, which achieves adaptive, accelerated clustering based on the geometric properties of local updates and a positive feedback mechanism during the clients' federated training. By dividing clients into different task clusters, clients with similar data distributions in the same cluster cooperate to improve the performance of the federated model. The algorithm neither requires the number of clusters to be determined in advance nor iteratively partitions the clients, thereby avoiding the high computational cost and slow convergence of existing clustered federated learning methods while maintaining model performance. The effectiveness of ACFL was verified using deep convolutional neural networks on commonly used datasets. The results show that ACFL achieves performance comparable to the clustered federated learning (CFL) algorithm, outperforms the traditional iterative federated clustering algorithm (IFCA), and converges faster. © 2024 Editorial Board of Journal on Communications. All rights reserved.
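The abstract describes the mechanism only at a high level. The sketch below illustrates the general idea of similarity-based client clustering, assuming cosine similarity between flattened local updates as the geometric characteristic and a simple threshold-based greedy grouping; the function names, threshold, and grouping rule are illustrative assumptions, not the paper's exact ACFL procedure.

```python
# Illustrative sketch (not the paper's exact ACFL algorithm): group clients by the
# cosine similarity of their flattened local model updates, without fixing the
# number of clusters in advance. Threshold and grouping rule are assumptions.
import numpy as np

def cosine_similarity_matrix(updates):
    """Pairwise cosine similarity between flattened client updates."""
    U = np.stack([u / (np.linalg.norm(u) + 1e-12) for u in updates])
    return U @ U.T

def split_clients(updates, threshold=0.5):
    """Greedily place clients whose updates point in similar directions
    into the same cluster; the number of clusters emerges from the data."""
    S = cosine_similarity_matrix(updates)
    unassigned = set(range(len(updates)))
    clusters = []
    while unassigned:
        seed = unassigned.pop()
        cluster = [seed]
        for j in list(unassigned):
            if S[seed, j] > threshold:
                cluster.append(j)
                unassigned.remove(j)
        clusters.append(cluster)
    return clusters

# Example: three clients with similar updates, one with a dissimilar update.
rng = np.random.default_rng(0)
base = rng.normal(size=100)
updates = [base + 0.1 * rng.normal(size=100) for _ in range(3)]
updates.append(-base)  # client whose data distribution differs
print(split_clients(updates))  # e.g. [[0, 1, 2], [3]]
```

Cosine similarity between client updates is the grouping criterion used in clustered FL methods such as CFL [24]; the abstract indicates that ACFL's contribution lies in accelerating this grouping adaptively through a positive feedback mechanism rather than repeated repartitioning.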
Pages: 197-207 (10 pages)
References (38 in total)
  • [21] HE C Y, ANNAVARAM M, AVESTIMEHR S., Group knowledge transfer: federated learning of large CNNs at the edge, (2020)
  • [22] HUANG L, SHEA A L, QIAN H N, Et al., Patient clustering improves efficiency of federated machine learning to predict mortality and hospital stay time using distributed electronic medical records, Journal of Biomedical Informatics, 99, (2019)
  • [23] GHOSH A, CHUNG J, YIN D, Et al., An efficient framework for clustered federated learning, IEEE Transactions on Information Theory, 68, 12, pp. 8076-8091, (2022)
  • [24] SATTLER F, MULLER K R, SAMEK W., Clustered federated learning: model-agnostic distributed multitask optimization under privacy constraints, IEEE Transactions on Neural Networks and Learning Systems, 32, 8, pp. 3710-3722, (2021)
  • [25] LI T, SAHU A K, ZAHEER M, Et al., Federated optimization in heterogeneous networks, (2018)
  • [26] YAO X, SUN L F., Continual local training for better initialization of federated models, Proceedings of the 2020 IEEE International Conference on Image Processing (ICIP), pp. 1736-1740, (2020)
  • [27] KARIMIREDDY S P, KALE S, MOHRI M, Et al., SCAFFOLD: stochastic controlled averaging for federated learning, Proceedings of the 37th International Conference on Machine Learning, pp. 5132-5143, (2020)
  • [28] FALLAH A, MOKHTARI A, OZDAGLAR A., Personalized federated learning with theoretical guarantees: a model-agnostic meta-learning approach, Proceedings of the 34th International Conference on Neural Information Processing Systems, pp. 3557-3568, (2020)
  • [29] DINH C T, TRAN N H, NGUYEN T D., Personalized federated learning with Moreau envelopes, (2020)
  • [30] KHODAK M, FLORINA-BALCAN M, TALWALKAR A., Adaptive gradient-based meta-learning methods, (2019)