Clustered Federated Learning Based on Momentum Gradient Descent for Heterogeneous Data

Cited by: 1
Authors
Zhao, Xiaoyi [1 ]
Xie, Ping [1 ]
Xing, Ling [1 ]
Zhang, Gaoyuan [1 ]
Ma, Huahong [1 ]
Affiliations
[1] Henan Univ Sci & Technol, Sch Informat Engn, Luoyang 471023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
clusters; data heterogeneity; federated learning; momentum gradient descent (MGD);
DOI
10.3390/electronics12091972
CLC classification number
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Data heterogeneity can significantly degrade the performance of federated learning because the clients' data distributions diverge. An effective way to mitigate this issue is to partition the clients into suitable clusters. However, existing clustered federated learning relies solely on gradient descent, which leads to poor convergence performance. To accelerate convergence, this paper proposes clustered federated learning based on momentum gradient descent (CFL-MGD), which integrates momentum with clustering techniques. In CFL-MGD, scattered clients are partitioned into the same cluster when they share the same learning task. Each client in a cluster then uses its own private data to update its local model parameters through momentum gradient descent. Moreover, we present two schemes for global aggregation: gradient averaging and model averaging. To better understand the proposed algorithm, we also prove that CFL-MGD converges at an exponential rate for smooth and strongly convex loss functions. Finally, we validate the effectiveness of CFL-MGD on the CIFAR-10 and MNIST datasets.
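To make the procedure described in the abstract concrete, the Python sketch below illustrates one communication round for a single cluster: every client in the cluster runs momentum gradient descent (v <- gamma*v + grad F(w), w <- w - eta*v) on its private data, and the server aggregates the resulting models by model averaging. This is a minimal illustration based only on the abstract; the function names, hyperparameters, and the choice of model averaging (rather than gradient averaging) are assumptions, not the authors' implementation.

import numpy as np

def local_mgd_update(w, grad_fn, data, lr=0.01, gamma=0.9, local_steps=5):
    # Momentum gradient descent on one client's private data
    # (assumed update rule: v <- gamma*v + g, w <- w - lr*v).
    v = np.zeros_like(w)                  # momentum buffer
    for _ in range(local_steps):
        g = grad_fn(w, data)              # gradient of the local loss
        v = gamma * v + g                 # accumulate momentum
        w = w - lr * v                    # momentum GD step
    return w

def cluster_model_average(client_models):
    # Global aggregation by averaging the model parameters within one cluster.
    return np.mean(np.stack(client_models), axis=0)

def cfl_mgd_round(w_cluster, client_datasets, grad_fn):
    # One round: each client in the cluster starts from the shared cluster model,
    # runs local momentum GD, and the server averages the updated models.
    updated = [local_mgd_update(w_cluster.copy(), grad_fn, d) for d in client_datasets]
    return cluster_model_average(updated)

# Illustrative usage with a least-squares loss (clients sharing the same task):
#   grad_fn = lambda w, d: d[0].T @ (d[0] @ w - d[1]) / len(d[1])
#   w_new = cfl_mgd_round(np.zeros(5), [(X1, y1), (X2, y2)], grad_fn)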
Pages: 24
Related Papers
50 records in total
  • [1] A Remedy for Heterogeneous Data: Clustered Federated Learning with Gradient Trajectory
    Liu, Ruiqi
    Yu, Songcan
    Lan, Linsi
    Wang, Junbo
    Kant, Krishna
    Calleja, Neville
    BIG DATA MINING AND ANALYTICS, 2024, 7(4): 1050-1064
  • [2] Accelerating Federated Learning via Momentum Gradient Descent
    Liu, Wei
    Chen, Li
    Chen, Yunfei
    Zhang, Wenyi
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2020, 31(8): 1754-1766
  • [3] SoFL: Clustered Federated Learning Based on Dual Clustering for Heterogeneous Data
    Zhang, Jianfei
    Qiao, Zhiming
    ELECTRONICS, 2024, 13(18)
  • [4] Privacy-Preserving Federated Learning based on Differential Privacy and Momentum Gradient Descent
    Weng, Shangyin
    Zhang, Lei
    Feng, Daquan
    Feng, Chenyuan
    Wang, Ruiyu
    Klaine, Paulo Valente
    Imran, Muhammad Ali
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [5] Adaptive Clustered Federated Learning for Heterogeneous Data in Edge Computing
    Gong, Biyao
    Xing, Tianzhang
    Liu, Zhidan
    Wang, Junfeng
    Liu, Xiuya
    MOBILE NETWORKS & APPLICATIONS, 2022, 27(4): 1520-1530
  • [6] Clustered Federated Learning in Heterogeneous Environment
    Yan, Yihan
    Tong, Xiaojun
    Wang, Shen
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35(9): 12796-12809
  • [7] Contrastive encoder pre-training-based clustered federated learning for heterogeneous data
    Tun, Ye Lin
    Nguyen, Minh N. H.
    Thwal, Chu Myaet
    Choi, Jinwoo
    Hong, Choong Seon
    NEURAL NETWORKS, 2023, 165: 689-704
  • [8] Clustered Federated Learning With Adaptive Local Differential Privacy on Heterogeneous IoT Data
    He, Zaobo
    Wang, Lintao
    Cai, Zhipeng
    IEEE INTERNET OF THINGS JOURNAL, 2024, 11(1): 137-146