A Mirror Descent-Based Algorithm for Corruption-Tolerant Distributed Gradient Descent

Cited: 0
Authors
Wang, Shuche [1 ]
Tan, Vincent Y. F. [2 ,3 ]
Affiliations
[1] Natl Univ Singapore, Inst Operat Res & Analyt, Singapore 117602, Singapore
[2] Natl Univ Singapore, Dept Math, Dept Elect & Comp Engn, Singapore 117602, Singapore
[3] Natl Univ Singapore, Inst Operat Res & Analyt, Singapore 117602, Singapore
Keywords
Vectors; Servers; Noise measurement; Signal processing algorithms; Machine learning algorithms; Convergence; Uplink; Mirrors; Downlink; Machine learning; Distributed gradient descent; mirror descent; adversarial corruptions; noisy channels; convergence rates;
DOI
10.1109/TSP.2025.3539883
CLC classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Subject classification codes
0808; 0809;
Abstract
Distributed gradient descent algorithms have come to the fore in modern machine learning, especially in parallelizing the handling of large datasets that are distributed across several workers. However, scant attention has been paid to analyzing the behavior of distributed gradient descent algorithms in the presence of adversarial corruptions instead of random noise. In this paper, we formulate a novel problem in which adversarial corruptions are present in a distributed learning system. We show how to use ideas from (lazy) mirror descent to design a corruption-tolerant distributed optimization algorithm. Extensive convergence analysis for (strongly) convex loss functions is provided for different choices of the stepsize. We carefully optimize the stepsize schedule to accelerate the convergence of the algorithm, while at the same time amortizing the effect of the corruption over time. Experiments based on linear regression, support vector classification, and softmax classification on the MNIST dataset corroborate our theoretical findings.
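The abstract's core ingredients (lazy mirror descent aggregating worker gradients in the dual space, with a decaying stepsize amortizing adversarial corruption) can be illustrated with a minimal sketch. This is not the paper's algorithm: the corruption model (a few workers adding Gaussian noise to their reported gradients), the negative-entropy mirror map over the simplex, the linear loss, and all parameter values below are assumptions chosen only to make the mechanism concrete.

```python
import numpy as np

def lazy_mirror_descent_simplex(c, num_workers=10, num_corrupt=2,
                                corruption_scale=5.0, steps=200, eta=0.5, seed=0):
    """Minimize the linear loss f(x) = <c, x> over the probability simplex
    using lazy mirror descent (dual averaging) with a negative-entropy
    mirror map, where a few workers report corrupted gradients."""
    rng = np.random.default_rng(seed)
    c = np.asarray(c, dtype=float)
    d = len(c)
    z = np.zeros(d)                      # accumulated dual (gradient) state
    for _ in range(steps):
        g = c                            # gradient of the linear loss is constant
        # honest workers report the true gradient; a few are adversarially noisy
        reports = np.tile(g, (num_workers, 1))
        reports[:num_corrupt] += corruption_scale * rng.standard_normal((num_corrupt, d))
        z = z + reports.mean(axis=0)     # lazy step: aggregate only in the dual space
        # mirror step for negative entropy: x = softmax(-eta * z)
        u = -eta * z
        w = np.exp(u - u.max())          # subtract max for numerical stability
        x = w / w.sum()
    return x

# The smallest coordinate of c should attract nearly all the probability mass
# despite 2 of 10 workers reporting corrupted gradients.
x = lazy_mirror_descent_simplex(c=[1.0, 2.0, 3.0])
```

Because honest gradients accumulate linearly in `z` while the averaged corruption grows only like the square root of the step count, the iterates still concentrate on the true minimizer; the paper's analysis makes this amortization precise for (strongly) convex losses and optimized stepsize schedules.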
Pages: 827-842
Page count: 16