Clustered Federated Learning Based on Momentum Gradient Descent for Heterogeneous Data

Cited: 1
Authors
Zhao, Xiaoyi [1]
Xie, Ping [1]
Xing, Ling [1]
Zhang, Gaoyuan [1]
Ma, Huahong [1]
Affiliations
[1] Henan Univ Sci & Technol, Sch Informat Engn, Luoyang 471023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
clusters; data heterogeneity; federated learning; momentum gradient descent (MGD);
DOI
10.3390/electronics12091972
CLC Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Data heterogeneity may significantly deteriorate the performance of federated learning, since the clients' data distributions diverge. To mitigate this issue, an effective method is to partition the clients into suitable clusters. However, existing clustered federated learning relies only on the gradient descent method, which leads to poor convergence performance. To accelerate the convergence rate, this paper proposes clustered federated learning based on momentum gradient descent (CFL-MGD), integrating momentum and clustering techniques. In CFL-MGD, scattered clients are partitioned into the same cluster when they share the same learning task. Each client in a cluster then uses its own private data to update the local model parameters through momentum gradient descent. Moreover, we present two global aggregation schemes: gradient averaging and model averaging. To understand the proposed algorithm, we also prove that CFL-MGD converges at an exponential rate for smooth and strongly convex loss functions. Finally, we validate the effectiveness of CFL-MGD on the CIFAR-10 and MNIST datasets.
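The training loop the abstract describes can be sketched in a toy form: each client in a cluster runs momentum gradient descent on its own loss, and the cluster server then aggregates by model averaging. This is a minimal illustration under assumptions, not the authors' implementation; the helper names (`momentum_gd`, `cluster_round`) and the scalar quadratic losses 0.5*(w - c_i)^2 are hypothetical stand-ins for real local objectives.

```python
# Toy sketch of a CFL-MGD-style round (hypothetical names, not the
# authors' code). Each client minimizes 0.5 * (w - c_i)**2, whose
# gradient is (w - c_i), via momentum gradient descent; the cluster
# server then performs model averaging over the cluster's clients.

def momentum_gd(w, grad_fn, lr=0.1, beta=0.9, steps=50):
    """Local MGD update: v <- beta*v + grad(w); w <- w - lr*v."""
    v = 0.0
    for _ in range(steps):
        v = beta * v + grad_fn(w)
        w -= lr * v
    return w

def cluster_round(models, minimizers, **kw):
    """One communication round: local MGD, then model averaging."""
    updated = [momentum_gd(w, lambda w, c=c: w - c, **kw)
               for w, c in zip(models, minimizers)]
    avg = sum(updated) / len(updated)   # model averaging
    return [avg] * len(updated)         # broadcast to every client

# Two clients whose local optima are 1.0 and 3.0: the shared cluster
# model settles near their average, 2.0.
models = [0.0, 0.0]
for _ in range(10):
    models = cluster_round(models, [1.0, 3.0])
```

The gradient-averaging variant mentioned in the abstract would instead average the clients' gradients (or momentum buffers) each step and apply one shared update, trading more communication for tighter synchronization.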
Pages: 24
Related Papers
50 records
  • [21] Sign-Based Gradient Descent With Heterogeneous Data: Convergence and Byzantine Resilience
    Jin, Richeng
    Liu, Yuding
    Huang, Yufan
    He, Xiaofan
    Wu, Tianfu
    Dai, Huaiyu
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (02) : 3834 - 3846
  • [22] Aperture Shape Generation Based on Gradient Descent With Momentum
    Zhang, Liyuan
    Zhang, Pengcheng
    Yang, Jie
    Li, Jie
    Gui, Zhiguo
    IEEE ACCESS, 2019, 7: 157623 - 157632
  • [23] FedMDFG: Federated Learning with Multi-Gradient Descent and Fair Guidance
    Pan, Zibin
    Wang, Shuyi
    Li, Chi
    Wang, Haijin
    Tang, Xiaoying
    Zhao, Junhua
    THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023, : 9364 - 9371
  • [24] Data classification based on fractional order gradient descent with momentum for RBF neural network
    Xue, Han
    Shao, Zheping
    Sun, Hongbo
    NETWORK-COMPUTATION IN NEURAL SYSTEMS, 2020, 31 (1-4) : 166 - 185
  • [25] Feature-Based Dataset Fingerprinting for Clustered Federated Learning on Medical Image Data
    Scheliga, Daniel
    Maeder, Patrick
    Seeland, Marco
    APPLIED ARTIFICIAL INTELLIGENCE, 2024, 38 (01)
  • [26] Clustered Federated Learning with Weighted Model Aggregation for Imbalanced Data
    Wang, Dong
    Zhang, Naifu
    Tao, Meixia
    CHINA COMMUNICATIONS, 2022, 19 (08) : 41 - 56
  • [27] Clustered federated learning based on nonconvex pairwise fusion
    Yu, Xue
    Liu, Ziyi
    Wang, Wu
    Sun, Yifan
    INFORMATION SCIENCES, 2024, 678
  • [28] Clustered Federated Learning Based on Client's Prototypes
    Lai, Weimin
    Xu, Zirong
    Yan, Qiao
    PROCEEDINGS OF THE 2024 27TH INTERNATIONAL CONFERENCE ON COMPUTER SUPPORTED COOPERATIVE WORK IN DESIGN, CSCWD 2024, 2024, : 909 - 914
  • [30] Learning to learn by gradient descent by gradient descent
    Andrychowicz, Marcin
    Denil, Misha
    Colmenarejo, Sergio Gomez
    Hoffman, Matthew W.
    Pfau, David
    Schaul, Tom
    Shillingford, Brendan
    de Freitas, Nando
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29