Federated Learning with Class Balanced Loss Optimized by Implicit Stochastic Gradient Descent

Cited by: 0
Authors
Zhou, Jincheng [1 ,3 ]
Zheng, Maoxing [2 ]
Affiliations
[1] Qiannan Normal Univ Nationalities, Sch Comp & Informat, Duyun 558000, Peoples R China
[2] Baoji Univ Arts & Sci, Sch Comp Sci, Baoji 721007, Peoples R China
[3] Key Lab Complex Syst & Intelligent Optimizat Guiz, Duyun 558000, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Fast convergence; Optimization algorithm; System heterogeneity; Data heterogeneity; Implicit stochastic gradient descent; Global model; Central server; Distributed machine learning; Federated learning
DOI
10.1007/978-981-99-0405-1_9
Chinese Library Classification
TP18 [Artificial intelligence theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Federated learning is a distributed machine learning paradigm in which a central server coordinates a large number of remote devices to train an optimal global model. System heterogeneity and data heterogeneity are currently the two largest impediments to federated learning. To address the slow convergence, or even non-convergence, of the global model caused by such heterogeneity, this work proposes a federated learning strategy based on implicit stochastic gradient descent optimization. The server estimates the average global gradient from the locally uploaded model parameters, without explicitly computing first derivatives or updating the global model through explicit gradient descent; this allows the global model to converge in fewer communication rounds and yields faster, more reliable convergence. In experiments simulating varying degrees of heterogeneity, the proposed strategy converged faster and more stably than FedProx and FedAvg, reducing the number of communication rounds on highly heterogeneous synthetic datasets by roughly 50% relative to FedProx and thereby considerably improving the stability and robustness of federated learning.
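To make the abstract's mechanism concrete, the sketch below illustrates one way the described update could work: clients take an implicit (proximal) local step, and the server recovers a derivative-free estimate of the average gradient from the uploaded parameters alone. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names (local_implicit_update, server_round), the fixed-point solver for the implicit step, and the separate local/global learning rates are all hypothetical.

import numpy as np

def local_implicit_update(w_global, grad_fn, lr, n_iters=10):
    # Implicit (proximal) SGD step: solve w = w_global - lr * grad_fn(w)
    # by fixed-point iteration rather than taking an explicit gradient step.
    w = w_global.copy()
    for _ in range(n_iters):
        w = w_global - lr * grad_fn(w)
    return w

def server_round(w_global, client_weights, lr_local, lr_global):
    # Each client i uploads w_i satisfying w_i ~= w_global - lr_local * g_i(w_i).
    # Rearranging gives a derivative-free estimate of each client's gradient,
    # so the server estimates the average global gradient from parameters alone.
    grad_estimates = [(w_global - w_i) / lr_local for w_i in client_weights]
    avg_grad = np.mean(grad_estimates, axis=0)
    return w_global - lr_global * avg_grad

Note that with lr_global equal to lr_local the server step collapses to plain parameter averaging, i.e., FedAvg; decoupling the two rates is what lets an update of this form behave differently from FedAvg while the server still never computes a first derivative.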
Pages: 121-135 (15 pages)