SUCAG: Stochastic Unbiased Curvature-aided Gradient Method for Distributed Optimization

Cited by: 0
Authors
Wai, Hoi-To [1 ]
Freris, Nikolaos M. [2 ,3 ]
Nedic, Angelia [1 ]
Scaglione, Anna [1 ]
Affiliations
[1] Arizona State Univ, Sch Elect Comp & Energy Engn, Tempe, AZ 85281 USA
[2] New York Univ Abu Dhabi, Div Engn, Abu Dhabi, U Arab Emirates
[3] NYU, Tandon Sch Engn, Brooklyn, NY USA
Keywords
Distributed optimization; Incremental methods; Asynchronous algorithms; Randomized algorithms; Multi-agent systems; Machine learning; SUBGRADIENT METHODS; CLOCKS;
DOI
Not available
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
We propose and analyze a new stochastic gradient method, which we call Stochastic Unbiased Curvature-aided Gradient (SUCAG), for finite-sum optimization problems. SUCAG constitutes an unbiased total gradient tracking technique that uses Hessian information to accelerate convergence. We analyze our method under the general asynchronous model of computation, in which each function is selected infinitely often with possibly unbounded (but sublinear) delay. For strongly convex problems, we establish linear convergence for the SUCAG method. When the initialization point is sufficiently close to the optimal solution, the established convergence rate depends only on the condition number of the problem, making it strictly faster than the known rate for the SAGA method. Furthermore, we describe a Markov-driven approach to implementing the SUCAG method in a distributed asynchronous multi-agent setting, via gossiping along a random walk on an undirected communication graph. We show that our analysis applies as long as the graph is connected and, notably, establishes an asymptotic linear convergence rate that is robust to the graph topology. Numerical results demonstrate the merits of our algorithm over existing methods.
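To make the curvature-aided gradient-tracking idea above concrete, here is a minimal, self-contained Python sketch: each component function keeps in memory the last iterate at which it was evaluated, and the tracked full gradient corrects each stored gradient with a Hessian-based first-order Taylor term before a gradient step is taken. The regularized logistic-regression instance, the uniform sampling of which component to refresh, the step size, and all variable names are illustrative assumptions; the sketch omits the paper's unbiasedness correction and the Markov-chain (random-walk) agent activation, so it should be read as a schematic of the tracking mechanism rather than the exact SUCAG update.

    # Illustrative sketch (assumed form, not the exact SUCAG update from the paper):
    # curvature-aided aggregated gradient tracking on a small regularized
    # logistic-regression finite sum, with uniformly sampled component refreshes.
    import numpy as np

    rng = np.random.default_rng(0)
    m, d, lam = 50, 5, 0.1                     # components, dimension, regularization
    X = rng.standard_normal((m, d))
    y = np.sign(X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(m))

    def grad(i, th):                           # gradient of component f_i at th
        s = 1.0 / (1.0 + np.exp(y[i] * (X[i] @ th)))      # sigmoid(-y_i x_i^T th)
        return -y[i] * s * X[i] + lam * th

    def hess(i, th):                           # Hessian of component f_i at th
        s = 1.0 / (1.0 + np.exp(-y[i] * (X[i] @ th)))     # sigmoid(y_i x_i^T th)
        return s * (1.0 - s) * np.outer(X[i], X[i]) + lam * np.eye(d)

    th = np.zeros(d)
    mem_th = np.zeros((m, d))                  # last point at which component i was refreshed
    mem_g = np.stack([grad(i, mem_th[i]) for i in range(m)])
    mem_H = np.stack([hess(i, mem_th[i]) for i in range(m)])

    # Running aggregates so each tracked-gradient evaluation costs O(d^2), not O(m d^2).
    S_g = mem_g.sum(axis=0)
    S_H = mem_H.sum(axis=0)
    S_Hth = np.einsum('ijk,ik->j', mem_H, mem_th)

    step = 0.2
    for k in range(3000):
        # Curvature-aided estimate of the full gradient:
        #   (1/m) * sum_j [ grad_j(mem_j) + H_j(mem_j) (th - mem_j) ]
        g = (S_g + S_H @ th - S_Hth) / m
        th = th - step * g

        i = rng.integers(m)                    # component (agent) sampled for a refresh
        gi, Hi = grad(i, th), hess(i, th)
        S_g += gi - mem_g[i]
        S_H += Hi - mem_H[i]
        S_Hth += Hi @ th - mem_H[i] @ mem_th[i]
        mem_g[i], mem_H[i], mem_th[i] = gi, Hi, th.copy()

    # Sanity check: the true full gradient should be close to zero at the final iterate.
    print("||full gradient|| at final iterate:",
          np.linalg.norm(np.mean([grad(i, th) for i in range(m)], axis=0)))

Maintaining the running sums S_g, S_H, and S_Hth is what keeps the per-iteration cost independent of the number of components m; this bookkeeping is the standard device that makes incremental curvature-aided estimators of this kind practical.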
Pages: 1751 - 1756
Number of pages: 6
Related Papers
50 records in total
  • [31] Stochastic intermediate gradient method for convex optimization problems
    Gasnikov, A. V.
    Dvurechensky, P. E.
    DOKLADY MATHEMATICS, 2016, 93 (02) : 148 - 151
  • [32] Stochastic Gradient Tracking Methods for Distributed Personalized Optimization over Networks
    Huang, Yan
    Xu, Jinming
    Meng, Wenchao
    Wai, Hoi-To
    2022 IEEE 61ST CONFERENCE ON DECISION AND CONTROL (CDC), 2022, : 4571 - 4578
  • [33] S-DIGing: A Stochastic Gradient Tracking Algorithm for Distributed Optimization
    Li, Huaqing
    Zheng, Lifeng
    Wang, Zheng
    Yan, Yu
    Feng, Liping
    Guo, Jing
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2022, 6 (01): : 53 - 65
  • [34] A Stochastic Gradient-Based Projection Algorithm for Distributed Constrained Optimization
    Zhang, Keke
    Gao, Shanfu
    Chen, Yingjue
    Zheng, Zuqing
    Lu, Qingguo
    NEURAL INFORMATION PROCESSING, ICONIP 2023, PT I, 2024, 14447 : 356 - 367
  • [35] Online Distributed Stochastic Gradient Algorithm for Nonconvex Optimization With Compressed Communication
    Li, Jueyou
    Li, Chaojie
    Fan, Jing
    Huang, Tingwen
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (02) : 936 - 951
  • [36] On the Convergence of the Conditional Gradient Method in Distributed Optimization Problems
    Chernov, A. V.
    COMPUTATIONAL MATHEMATICS AND MATHEMATICAL PHYSICS, 2011, 51 (09) : 1510 - 1523
  • [37] An Accelerated Distributed Conditional Gradient Method for Online Optimization
    Shen, Xiuyu
    Li, Dequan
    Dong, Qiao
    Xue, Sheng
    2019 11TH INTERNATIONAL CONFERENCE ON INTELLIGENT HUMAN-MACHINE SYSTEMS AND CYBERNETICS (IHMSC 2019), VOL 2, 2019, : 29 - 32
  • [38] On the convergence of the conditional gradient method in distributed optimization problems
    A. V. Chernov
    Computational Mathematics and Mathematical Physics, 2011, 51 : 1510 - 1523
  • [39] Statistically Preconditioned Accelerated Gradient Method for Distributed Optimization
    Hendrikx, Hadrien
    Xiao, Lin
    Bubeck, Sebastien
    Bach, Francis
    Massoulie, Laurent
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [40] Gradient-free method for nonsmooth distributed optimization
    Li, Jueyou
    Wu, Changzhi
    Wu, Zhiyou
    Long, Qiang
    JOURNAL OF GLOBAL OPTIMIZATION, 2015, 61 (02) : 325 - 340