A DECENTRALIZED VARIANCE-REDUCED METHOD FOR STOCHASTIC OPTIMIZATION OVER DIRECTED GRAPHS

Times Cited: 1
Authors
Qureshi, Muhammad I. [1]
Xin, Ran [2 ]
Kar, Soummya [2 ]
Khan, Usman A. [1 ]
Affiliations
[1] Tufts Univ, Medford, MA 02155 USA
[2] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
Funding
National Science Foundation (USA);
Keywords
Stochastic optimization; first-order methods; variance reduction; decentralized algorithms; directed graphs; DISTRIBUTED OPTIMIZATION;
DOI
10.1109/ICASSP39728.2021.9413600
Chinese Library Classification
O42 [Acoustics];
Discipline Classification Codes
070206; 082403;
Abstract
In this paper, we propose Push-SAGA, a decentralized first-order stochastic optimization method for finite-sum minimization over a strongly connected directed graph. This method features local variance reduction to remove the uncertainty caused by random sampling of the local gradients, global gradient tracking to address the distributed nature of the data, and push-sum consensus to handle the imbalance caused by the directed nature of the underlying graph. We show that, for a sufficiently small step-size, Push-SAGA converges linearly to the optimal solution for smooth and strongly convex problems, making it the first linearly convergent stochastic algorithm over arbitrary strongly connected directed graphs. We illustrate the behavior and convergence properties of Push-SAGA through numerical experiments on strongly convex and non-convex problems.
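As a rough, single-node illustration of the local variance-reduction component described in the abstract (a SAGA-style gradient estimator; the gradient-tracking and push-sum steps of the full decentralized method are omitted here), a minimal sketch on a toy least-squares finite sum might look like the following. The function and variable names are illustrative, not from the paper.

```python
import numpy as np

def saga_gradient(grad_fn, x, i, grad_table):
    """SAGA-style variance-reduced gradient estimate for component i:
    g = grad_i(x) - stored_grad_i + mean(stored gradients),
    after which the stored gradient for component i is refreshed."""
    g_new = grad_fn(i, x)
    g = g_new - grad_table[i] + grad_table.mean(axis=0)
    grad_table[i] = g_new  # refresh the table entry for component i
    return g

# Toy finite sum: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 20, 5
A = rng.standard_normal((n, d))
x_star = rng.standard_normal(d)
b = A @ x_star  # consistent system, so x_star is the minimizer

grad_fn = lambda i, x: (A[i] @ x - b[i]) * A[i]

x = np.zeros(d)
table = np.array([grad_fn(i, x) for i in range(n)])  # gradient table
step = 0.02
for _ in range(5000):
    i = rng.integers(n)  # uniform random component sampling
    x = x - step * saga_gradient(grad_fn, x, i, table)

print(np.linalg.norm(x - x_star))  # distance to the minimizer
```

Because the stored gradients converge to the gradients at the optimum, the variance of the estimator vanishes, which is what enables linear convergence with a constant step-size; in Push-SAGA each node maintains such a table for its own local data.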
Pages: 5030 - 5034
Page count: 5
Related Papers
50 in total
  • [21] Accelerating variance-reduced stochastic gradient methods
    Driggs, Derek
    Ehrhardt, Matthias J.
    Schönlieb, Carola-Bibiane
    MATHEMATICAL PROGRAMMING, 2022, 191 (02) : 671 - 715
  • [23] Stochastic Variance-Reduced Cubic Regularization Methods
    Zhou, Dongruo
    Xu, Pan
    Gu, Quanquan
    JOURNAL OF MACHINE LEARNING RESEARCH, 2019, 20
  • [25] Stochastic variance-reduced prox-linear algorithms for nonconvex composite optimization
    Zhang, Junyu
    Xiao, Lin
    MATHEMATICAL PROGRAMMING, 2022, 195 (1-2) : 649 - 691
  • [27] A decentralized Nesterov gradient method for stochastic optimization over unbalanced directed networks
    Hu, Jinhui
    Xia, Dawen
    Cheng, Huqiang
    Feng, Liping
    Ji, Lianghao
    Guo, Jing
    Li, Huaqing
    ASIAN JOURNAL OF CONTROL, 2022, 24 (02) : 576 - 593
  • [28] Stochastic Variance-Reduced Cubic Regularized Newton Methods
    Zhou, Dongruo
    Xu, Pan
    Gu, Quanquan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018, 80
  • [29] Stochastic Recursive Variance-Reduced Cubic Regularization Methods
    Zhou, Dongruo
    Gu, Quanquan
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 108, 2020, 108 : 3980 - 3989
  • [30] S-ADDOPT: Decentralized Stochastic First-Order Optimization Over Directed Graphs
    Qureshi, Muhammad I.
    Xin, Ran
    Kar, Soummya
    Khan, Usman A.
    IEEE CONTROL SYSTEMS LETTERS, 2021, 5 (03): : 953 - 958