A Distributed Stochastic Proximal-Gradient Algorithm for Composite Optimization

Cited by: 7
Authors
Niu, Youcheng [1 ]
Li, Huaqing [1 ]
Wang, Zheng [2 ]
Lu, Qingguo [1 ]
Xia, Dawen [3 ]
Ji, Lianghao [4 ]
Affiliations
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligen, Chongqing 400715, Peoples R China
[2] Univ New South Wales, Sch Elect Engn & Telecommun, Sydney, NSW 2052, Australia
[3] Guizhou Minzu Univ, Coll Data Sci & Informat Engn, Guiyang 550025, Peoples R China
[4] Chongqing Univ Posts & Telecommun, Chongqing Key Lab Image Cognit, Chongqing 400065, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Linear convergence; distributed composite optimization; machine learning; proximal-gradient method; stochastic averaging gradient (SAG); consensus; convergence
DOI
10.1109/TCNS.2021.3065653
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
In this article, we consider distributed composite optimization problems involving a common nonsmooth regularization term over an undirected, connected network. Motivated by the many large-scale machine-learning applications of this problem class, the local cost function of each node is further modeled as an average of a number of local cost subfunctions. In this scenario, most existing proximal-method-based solutions ignore the cost of gradient evaluations, which degrades performance. We instead develop a distributed stochastic proximal-gradient algorithm that tackles these problems by employing a local unbiased stochastic averaging gradient method. At each iteration, each node evaluates the gradient of only a single local cost subfunction, and the average of the latest stochastic gradients then serves as an approximation of the true local gradient. An explicit linear convergence rate of the proposed algorithm is established with constant dual step-sizes for strongly convex local cost subfunctions with Lipschitz-continuous gradients. Furthermore, we show that, in the smooth case, our simplified analysis technique extends to several notable primal-dual-domain algorithms, such as DSA, EXTRA, and DIGing. Numerical experiments confirm the theoretical findings.
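The gradient-averaging idea described in the abstract (each node re-evaluates the gradient of only one subfunction per iteration, and the running average of stored gradients stands in for the full local gradient) can be sketched for a single node as follows. This is a minimal SAG-style illustration with an l1 regularizer, not the paper's algorithm: the names `prox_l1` and `sag_proximal_gradient` and all parameters are hypothetical, and the actual method additionally enforces network consensus through dual variables.

```python
import numpy as np

def prox_l1(v, lam):
    # Proximal operator of lam * ||x||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def sag_proximal_gradient(grads, x0, step, lam, n_iters, rng):
    """One node's SAG-style proximal-gradient loop (illustrative sketch).

    grads : list of callables, grads[i](x) = gradient of the i-th
            local cost subfunction at x.
    Per iteration, only ONE subfunction gradient is re-evaluated; the
    stored table of past gradients supplies the rest, and their average
    approximates the true local gradient.
    """
    m = len(grads)
    x = x0.copy()
    table = np.array([g(x0) for g in grads])  # stored past gradients
    avg = table.mean(axis=0)                  # running average
    for _ in range(n_iters):
        i = rng.integers(m)                   # sample one subfunction
        new_g = grads[i](x)
        avg += (new_g - table[i]) / m         # update average in O(d)
        table[i] = new_g
        x = prox_l1(x - step * avg, step * lam)  # proximal-gradient step
    return x
```

Note the O(d) update of the running average: replacing one table entry costs a single gradient evaluation per iteration, which is exactly the saving over a full-gradient proximal method that the abstract refers to.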
Pages: 1383-1393 (11 pages)