Clustered Federated Learning Based on Momentum Gradient Descent for Heterogeneous Data

Cited by: 1
Authors
Zhao, Xiaoyi [1]
Xie, Ping [1]
Xing, Ling [1]
Zhang, Gaoyuan [1]
Ma, Huahong [1]
Affiliations
[1] Henan Univ Sci & Technol, Sch Informat Engn, Luoyang 471023, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
clusters; data heterogeneity; federated learning; momentum gradient descent (MGD);
DOI
10.3390/electronics12091972
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Discipline code
0812;
Abstract
Data heterogeneity can significantly degrade the performance of federated learning because the data distributions of clients diverge. An effective way to mitigate this issue is to partition the clients into suitable clusters. However, existing clustered federated learning relies solely on the gradient descent method, which leads to poor convergence performance. To accelerate the convergence rate, this paper proposes clustered federated learning based on momentum gradient descent (CFL-MGD), which integrates momentum and clustering techniques. In CFL-MGD, scattered clients are partitioned into the same cluster when they share the same learning task. Each client in a cluster then uses its own private data to update its local model parameters through momentum gradient descent. Moreover, we present two schemes for global aggregation: gradient averaging and model averaging. To understand the proposed algorithm, we also prove that CFL-MGD converges at an exponential rate for smooth and strongly convex loss functions. Finally, we validate the effectiveness of CFL-MGD on the CIFAR-10 and MNIST datasets.
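To make the mechanism in the abstract concrete, below is a minimal NumPy sketch of one CFL-MGD-style training round: clients grouped into the same cluster run momentum gradient descent (MGD) on their private data, and the server aggregates by model averaging. All names here (`Client`, `run_cluster_round`, `lr`, `beta`, the least-squares loss) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

class Client:
    """One federated client holding private data and a momentum buffer."""
    def __init__(self, X, y):
        self.X, self.y = X, y     # private local data (never shared)
        self.velocity = None      # momentum buffer for MGD

    def gradient(self, w):
        # Gradient of a least-squares loss, which is smooth and strongly
        # convex -- the setting assumed by the convergence analysis.
        return self.X.T @ (self.X @ w - self.y) / len(self.y)

    def mgd_step(self, w, lr=0.05, beta=0.9):
        # One momentum gradient descent update of the local model.
        if self.velocity is None:
            self.velocity = np.zeros_like(w)
        self.velocity = beta * self.velocity + self.gradient(w)
        return w - lr * self.velocity

def run_cluster_round(w_cluster, clients, local_steps=5):
    # Clients that share a learning task form one cluster; each refines the
    # shared cluster model with MGD, and the server aggregates the results
    # by model averaging.
    updated = []
    for c in clients:
        w = w_cluster.copy()
        for _ in range(local_steps):
            w = c.mgd_step(w)
        updated.append(w)
    return np.mean(updated, axis=0)

# Usage: two clients with the same underlying task grouped into one cluster.
rng = np.random.default_rng(0)
w_true = rng.normal(size=3)
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 3))
    clients.append(Client(X, X @ w_true + 0.01 * rng.normal(size=50)))

w = np.zeros(3)
for _ in range(30):
    w = run_cluster_round(w, clients)
print(np.linalg.norm(w - w_true))  # error shrinks roughly geometrically
```

The gradient-averaging variant mentioned in the abstract would instead have the server average the clients' momentum-corrected gradients and apply a single update to the cluster model; for smooth and strongly convex losses such as the least-squares objective above, the paper's analysis claims an exponential convergence rate for both aggregation schemes.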
Pages: 24
Related Papers
50 records in total
  • [41] Toward data efficient anomaly detection in heterogeneous edge-cloud environments using clustered federated learning
    Wei, Zongpu
    Wang, Jinsong
    Zhao, Zening
    Shi, Kai
    FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2025, 164
  • [42] Robust Federated Learning for Heterogeneous Model and Data
    Madni, Hussain Ahmad
    Umer, Rao Muhammad
    Foresti, Gian Luca
    INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2024, 34 (04)
  • [43] Gradient Scheduling With Global Momentum for Asynchronous Federated Learning in Edge Environment
    Wang, Haozhao
    Li, Ruixuan
    Li, Chengjie
    Zhou, Pan
    Li, Yuhua
    Xu, Wenchao
    Guo, Song
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (19) : 18817 - 18828
  • [44] Ensuring Fairness and Gradient Privacy in Personalized Heterogeneous Federated Learning
    Lewis, Cody
    Varadharajan, Vijay
    Noman, Nasimul
    Tupakula, Uday
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2024, 15 (03)
  • [45] Depersonalized Federated Learning: Tackling Statistical Heterogeneity by Alternating Stochastic Gradient Descent
    Zhou, Yujie
    Li, Zhidu
    Tang, Tong
    Wang, Ruyan
    ICC 2023-IEEE INTERNATIONAL CONFERENCE ON COMMUNICATIONS, 2023, : 1988 - 1993
  • [46] Boosting the Federation: Cross-Silo Federated Learning without Gradient Descent
    Polato, Mirko
    Esposito, Roberto
    Aldinucci, Marco
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [47] Federated Generalized Bayesian Learning via Distributed Stein Variational Gradient Descent
    Kassab, Rahif
    Simeone, Osvaldo
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 2180 - 2192
  • [48] On the Hyperparameters in Stochastic Gradient Descent with Momentum
    Shi, Bin
    JOURNAL OF MACHINE LEARNING RESEARCH, 2024, 25
  • [49] Federated Learning with Class Balanced Loss Optimized by Implicit Stochastic Gradient Descent
    Zhou, Jincheng
    Zheng, Maoxing
    SOFT COMPUTING IN DATA SCIENCE, SCDS 2023, 2023, 1771 : 121 - 135
  • [50] Soft-Sign Stochastic Gradient Descent Algorithm for Wireless Federated Learning
    Lee, Seunghoon
    Park, Chanho
    Hong, Songnam
    Eldar, Yonina C.
    Lee, Namyoon
    SPAWC 2021: 2021 IEEE 22ND INTERNATIONAL WORKSHOP ON SIGNAL PROCESSING ADVANCES IN WIRELESS COMMUNICATIONS (IEEE SPAWC 2021), 2021, : 241 - 245