A Distributed Stochastic Proximal-Gradient Algorithm for Composite Optimization

Citations: 7
Authors
Niu, Youcheng [1 ]
Li, Huaqing [1 ]
Wang, Zheng [2 ]
Lu, Qingguo [1 ]
Xia, Dawen [3 ]
Ji, Lianghao [4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligen, Chongqing 400715, Peoples R China
[2] Univ New South Wales, Sch Elect Engn & Telecommun, Sydney, NSW 2052, Australia
[3] Guizhou Minzu Univ, Coll Data Sci & Informat Engn, Guiyang 550025, Peoples R China
[4] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Image Cognit, Chongqing 400065, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Linear convergence; distributed composite optimization; machine learning; proximal-gradient method; stochastic averaging gradient (SAG); CONSENSUS; CONVERGENCE
DOI
10.1109/TCNS.2021.3065653
CLC Classification
TP [automation technology, computer technology]
Discipline Code
0812
Abstract
In this article, we consider distributed composite optimization problems with a common nonsmooth regularization term over an undirected, connected network. Motivated by the wide range of applications of such problems in large-scale machine learning, the local cost function of each node is modeled as the average of a finite number of local cost subfunctions. In this setting, most existing proximal-method-based solutions ignore the cost of gradient evaluations, which degrades their performance. We instead develop a distributed stochastic proximal-gradient algorithm that employs the local unbiased stochastic averaging gradient method. At each iteration, each node evaluates the gradient of only a single local cost subfunction; the average of the most recent stochastic gradients then serves as an approximation of the true local gradient. An explicit linear convergence rate of the proposed algorithm is established with constant dual step-sizes for strongly convex local cost subfunctions with Lipschitz-continuous gradients. Furthermore, we show that, in the smooth case, our simplified analysis technique extends to several notable primal-dual-domain algorithms, such as DSA, EXTRA, and DIGing. Numerical experiments confirm the theoretical findings.
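The per-iteration scheme described in the abstract — sample a single local cost subfunction, update a stochastic averaging gradient, then take a proximal step — can be sketched for one node as follows. This is a hypothetical single-node illustration, not the paper's distributed update (which additionally involves network consensus and dual step-sizes); the names `sag_prox` and `prox_l1`, all parameter values, and the choice of an l1 regularizer are our assumptions.

```python
import numpy as np

def prox_l1(x, t):
    # Soft-thresholding: proximal operator of t * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def sag_prox(grad_fn, n_sub, dim, step=0.05, reg=1e-3, iters=5000, seed=0):
    """Single-node SAG + proximal-gradient sketch.

    grad_fn(i, x) returns the gradient of the i-th local cost subfunction at x.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(dim)
    table = np.zeros((n_sub, dim))    # last stored gradient of each subfunction
    avg = np.zeros(dim)               # running average of the stored gradients
    for _ in range(iters):
        i = rng.integers(n_sub)       # sample one subfunction per iteration
        g = grad_fn(i, x)
        avg += (g - table[i]) / n_sub # refresh the average in O(dim) time
        table[i] = g
        x = prox_l1(x - step * avg, step * reg)  # proximal-gradient step
    return x
```

Only one subfunction gradient is evaluated per iteration, yet the averaged gradient `avg` approximates the full local gradient, which is what allows a constant step-size and a linear rate in the strongly convex, Lipschitz-gradient setting the abstract describes.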
Pages: 1383-1393
Page count: 11
Related Papers
(50 total; items [21]-[30] shown)
  • [21] Distributed Adaptive Gradient Algorithm With Gradient Tracking for Stochastic Nonconvex Optimization
    Han, Dongyu
    Liu, Kun
    Lin, Yeming
    Xia, Yuanqing
    [J]. IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (09) : 6333 - 6340
  • [22] ALTERNATING PROXIMAL-GRADIENT STEPS FOR (STOCHASTIC) NONCONVEX-CONCAVE MINIMAX PROBLEMS
    Bot, Radu Ioan
    Boehm, Axel
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2023, 33 (03) : 1884 - 1913
  • [23] Numerical experiments on stochastic block proximal-gradient type method for convex constrained optimization involving coordinatewise separable problems
    Promsinchai, Porntip
    Petrot, Narin
    [J]. CARPATHIAN JOURNAL OF MATHEMATICS, 2019, 35 (03) : 371 - 378
  • [24] Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery
    Gu, Renliang
    Dogandzic, Aleksandar
    [J]. IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2017, 65 (13) : 3510 - 3525
  • [25] A MULTILEVEL PROXIMAL GRADIENT ALGORITHM FOR A CLASS OF COMPOSITE OPTIMIZATION PROBLEMS
    Parpas, Panos
    [J]. SIAM JOURNAL ON SCIENTIFIC COMPUTING, 2017, 39 (05): : S681 - S701
  • [26] An Inertial Proximal-Gradient Penalization Scheme for Constrained Convex Optimization Problems
    Boţ, R. I.
    Csetnek, E. R.
    Nimana, N.
    [J]. VIETNAM JOURNAL OF MATHEMATICS, 2018, 46 (1) : 53 - 71
  • [27] Efficiency of Stochastic Coordinate Proximal Gradient Methods on Nonseparable Composite Optimization
    Necoara, Ion
    Chorobura, Flavia
    [J]. MATHEMATICS OF OPERATIONS RESEARCH, 2024,
  • [28] Edge-Based Stochastic Gradient Algorithm for Distributed Optimization
    Wang, Zheng
    Li, Huaqing
    [J]. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2020, 7 (03): : 1421 - 1430
  • [29] Golden Ratio Proximal Gradient ADMM for Distributed Composite Convex Optimization
    Yin, Chao
    Yang, Junfeng
    [J]. JOURNAL OF OPTIMIZATION THEORY AND APPLICATIONS, 2024, 200 (03) : 895 - 922
  • [30] Proximal-gradient algorithms for fractional programming
    Bot, Radu Ioan
    Csetnek, Ernoe Robert
    [J]. OPTIMIZATION, 2017, 66 (08) : 1383 - 1396