A Mirror Descent-Based Algorithm for Corruption-Tolerant Distributed Gradient Descent

Cited by: 0
Authors:
Wang, Shuche [1]
Tan, Vincent Y. F. [2,3]
Affiliations:
[1] National University of Singapore, Institute of Operations Research and Analytics, Singapore 117602, Singapore
[2] National University of Singapore, Department of Mathematics and Department of Electrical and Computer Engineering, Singapore 117602, Singapore
[3] National University of Singapore, Institute of Operations Research and Analytics, Singapore 117602, Singapore
Keywords:
Vectors; Servers; Noise measurement; Signal processing algorithms; Machine learning algorithms; Convergence; Uplink; Mirrors; Downlink; Machine learning; Distributed gradient descent; mirror descent; adversarial corruptions; noisy channels; convergence rates
DOI: 10.1109/TSP.2025.3539883
CLC classification: TM (Electrical engineering); TN (Electronics and communication technology)
Discipline codes: 0808; 0809
Abstract
Distributed gradient descent algorithms have come to the fore in modern machine learning, especially in parallelizing the handling of large datasets that are distributed across several workers. However, scant attention has been paid to analyzing the behavior of distributed gradient descent algorithms in the presence of adversarial corruptions instead of random noise. In this paper, we formulate a novel problem in which adversarial corruptions are present in a distributed learning system. We show how to use ideas from (lazy) mirror descent to design a corruption-tolerant distributed optimization algorithm. Extensive convergence analysis for (strongly) convex loss functions is provided for different choices of the stepsize. We carefully optimize the stepsize schedule to accelerate the convergence of the algorithm, while at the same time amortizing the effect of the corruption over time. Experiments based on linear regression, support vector classification, and softmax classification on the MNIST dataset corroborate our theoretical findings.
Pages: 827-842 (16 pages)
Related papers (showing items [31]-[40] of 50):
  • [31] Algorithm for Data Balancing Based on Gradient Descent
    Mukhin, A. V.
    Kilbas, I. A.
    Paringer, R. A.
    Ilyasova, N. Yu.
    Kupriyanov, A. V.
    PROCEEDINGS OF THE 2ND INTERNATIONAL CONFERENCE ON ADVANCES IN SIGNAL PROCESSING AND ARTIFICIAL INTELLIGENCE, ASPAI' 2020, 2020 : 56 - 59
  • [32] Quantized Gradient-Descent Algorithm for Distributed Resource Allocation
    Zhou, Hongbing
    Yu, Weiyong
    Yi, Peng
    Hong, Yiguang
    UNMANNED SYSTEMS, 2019, 7 (02) : 119 - 136
  • [33] Byzantine Fault Tolerant Distributed Stochastic Gradient Descent Based on Over-the-Air Computation
    Park, Sangjun
    Choi, Wan
    IEEE TRANSACTIONS ON COMMUNICATIONS, 2022, 70 (05) : 3204 - 3219
  • [34] Distributed Kernel-Based Gradient Descent Algorithms
    Lin, Shao-Bo
    Zhou, Ding-Xuan
    CONSTRUCTIVE APPROXIMATION, 2018, 47 (02) : 249 - 276
  • [35] A gradient descent algorithm for LASSO
    Kim, Yongdai
    Kim, Yuwon
    Kim, Jinseog
    PREDICTION AND DISCOVERY, 2007, 443 : 73 - 82
  • [36] Reinforcement learning with constraint based on mirror descent algorithm
    Miyashita, Megumi
    Kondo, Toshiyuki
    Yano, Shiro
    RESULTS IN CONTROL AND OPTIMIZATION, 2021, 4
  • [38] A Code-Based Distributed Gradient Descent Method
    Atallah, Elie
    Rahnavard, Nazanin
    2018 56TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2018 : 951 - 958
  • [39] Distributed Mirror Descent Algorithm With Bregman Damping for Nonsmooth Constrained Optimization
    Chen, Guanpu
    Xu, Gehui
    Li, Weijian
    Hong, Yiguang
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2023, 68 (11) : 6921 - 6928
  • [40] Adaptive quantized online distributed mirror descent algorithm with Bandit feedback
    Xie, J.-R.
    Gao, W.-H.
    Xie, Y.-B.
    Kongzhi Lilun Yu Yingyong / Control Theory and Applications, 2023, 40 (10) : 1774 - 1782