An enhanced gradient-tracking bound for distributed online stochastic convex optimization

Cited by: 0
Authors
Alghunaim, Sulaiman A. [1 ]
Yuan, Kun [2 ]
Affiliations
[1] Kuwait Univ, Dept Elect Engn, Kuwait, Kuwait
[2] Peking Univ, Ctr Machine Learning Res, Beijing, Peoples R China
Keywords
Distributed stochastic optimization; Decentralized learning; Gradient-tracking; Adapt-then-combine; LINEAR CONVERGENCE; ALGORITHMS;
DOI
10.1016/j.sigpro.2023.109345
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Gradient-tracking (GT) based decentralized methods have emerged as an effective and viable alternative to decentralized (stochastic) gradient descent (DSGD) for solving distributed online stochastic optimization problems. Initial analyses of GT methods suggested that they have a worse network-dependent rate than DSGD, contradicting experimental results. This discrepancy has recently been resolved, and tighter rates have been established for GT methods that improve upon DSGD. In this work, we establish further enhanced rates for GT methods in the online stochastic convex setting. We present an alternative approach for analyzing GT methods for convex problems over static graphs. Compared with previous analyses, this approach allows us to establish improved network-dependent rates.
Pages: 9
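
For readers unfamiliar with the method family named in the abstract, the following is a minimal sketch, assuming a standard adapt-then-combine gradient-tracking (ATC-GT) recursion over a static, doubly stochastic mixing matrix. The toy least-squares objectives, ring topology, step size, and Gaussian gradient noise are illustrative assumptions, not the paper's setup or analysis.

```python
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim, alpha, sigma = 5, 3, 0.05, 0.1

# Static ring graph with a doubly stochastic mixing matrix W (uniform 1/3 weights assumed).
W = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    for j in (i - 1, i, i + 1):
        W[i, j % n_nodes] = 1.0 / 3.0

# Node i holds a local least-squares objective f_i(x) = 0.5 * ||A_i x - b_i||^2 (illustrative data).
A = rng.standard_normal((n_nodes, 10, dim))
b = rng.standard_normal((n_nodes, 10))

def stoch_grad(i, x):
    """Gradient of f_i at x plus Gaussian noise standing in for the online samples."""
    return A[i].T @ (A[i] @ x - b[i]) + sigma * rng.standard_normal(dim)

X = np.zeros((n_nodes, dim))                                  # local iterates x_i
G = np.stack([stoch_grad(i, X[i]) for i in range(n_nodes)])   # trackers y_i, initialized with local gradients
prev = G.copy()                                               # stochastic gradients at the previous iterates

for _ in range(500):
    X = W @ (X - alpha * G)                                   # adapt-then-combine descent step
    new = np.stack([stoch_grad(i, X[i]) for i in range(n_nodes)])
    G = W @ (G + new - prev)                                  # combine trackers with local gradient increments
    prev = new

print("consensus error:", np.linalg.norm(X - X.mean(axis=0)))
```

Each node descends along its tracker y_i, a running estimate of the network-average gradient maintained by the second combine step; this tracking of the average gradient is what distinguishes GT methods from DSGD, which descends along raw local gradients.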