Edge-Based Stochastic Gradient Algorithm for Distributed Optimization

Cited by: 33
Authors
Wang, Zheng [1 ]
Li, Huaqing [1 ]
Affiliation
[1] Southwest Univ, Coll Elect & Informat Engn, Chongqing Key Lab Nonlinear Circuits & Intelligen, Chongqing 400715, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Convergence; Optimization; Linear programming; Convex functions; Laplace equations; Machine learning; Training; Distributed convex optimization; machine learning; augmented Lagrange; stochastic averaging gradient; ALTERNATING DIRECTION METHOD; MULTIAGENT NETWORKS; CONVEX-OPTIMIZATION; SUBGRADIENT METHODS; LINEAR CONVERGENCE; CONSENSUS;
DOI
10.1109/TNSE.2019.2933177
CLC number
T [Industrial Technology]
Subject classification code
08
Abstract
This paper investigates distributed optimization problems in which a group of networked nodes collaboratively minimizes the sum of all local objective functions. The local objective function of each node is in turn the average of a finite set of subfunctions. This setting is motivated by machine learning problems in which large training sets are distributed across, and known privately to, individual computational nodes. An augmented Lagrange (AL) stochastic gradient algorithm is presented to solve the distributed optimization problem; it combines a factorization of the weighted graph Laplacian with a local unbiased stochastic averaging gradient method. At each iteration, each node evaluates the gradient of only one randomly selected subfunction, and a variance-reduced stochastic averaging gradient technique is applied to approximate the gradient of the local objective function. Strong convexity of the local subfunctions and Lipschitz continuity of their gradients are shown to ensure a linear convergence rate of the proposed algorithm in expectation. Numerical experiments on a logistic regression problem corroborate the theoretical results.
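
To make the gradient step concrete, below is a minimal sketch, in Python, of a per-node variance-reduced stochastic averaging gradient (SAGA-style) estimator of the kind the abstract describes, instantiated for logistic-regression subfunctions. It is not the paper's implementation: the names (logistic_grad, LocalSagaEstimator), the step size, and the plain gradient-descent driver at the end are illustrative assumptions, and the edge-based augmented-Lagrange coupling between neighboring nodes is omitted.

    import numpy as np

    def logistic_grad(x, a, b):
        # Gradient of the logistic loss log(1 + exp(-b * a^T x)) with respect to x.
        return -b * a / (1.0 + np.exp(b * a.dot(x)))

    class LocalSagaEstimator:
        # Keeps one stored gradient per local subfunction. Each call refreshes the
        # gradient of a single randomly chosen subfunction and returns an unbiased,
        # variance-reduced estimate of the gradient of the local objective
        # (the average of the subfunctions).
        def __init__(self, A, b, x0, rng=None):
            self.A, self.b = A, b                      # local samples (rows of A) and labels b
            self.m = A.shape[0]                        # number of local subfunctions
            self.rng = rng if rng is not None else np.random.default_rng()
            self.table = np.stack([logistic_grad(x0, A[i], b[i]) for i in range(self.m)])
            self.table_avg = self.table.mean(axis=0)   # running average of stored gradients

        def estimate(self, x):
            i = self.rng.integers(self.m)              # evaluate only one subfunction gradient
            g_new = logistic_grad(x, self.A[i], self.b[i])
            g_est = g_new - self.table[i] + self.table_avg   # stochastic averaging gradient estimate
            self.table_avg = self.table_avg + (g_new - self.table[i]) / self.m
            self.table[i] = g_new                      # store the fresh gradient
            return g_est

    # Illustrative single-node usage: a plain gradient step driven by the estimate.
    rng = np.random.default_rng(0)
    A = rng.normal(size=(50, 5))
    b = np.sign(rng.normal(size=50))
    x = np.zeros(5)
    estimator = LocalSagaEstimator(A, b, x, rng)
    for _ in range(2000):
        x = x - 0.1 * estimator.estimate(x)

In the algorithm discussed here, an estimate of this kind stands in for the exact local gradient inside the edge-based augmented-Lagrange update; only the local estimator is sketched above.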
Pages: 1421-1430
Number of pages: 10
Related papers
50 records in total
  • [1] An Edge-based Stochastic Proximal Gradient Algorithm for Decentralized Composite Optimization
    Zhang, Ling
    Yan, Yu
    Wang, Zheng
    Li, Huaqing
    [J]. INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2021, 19 (11) : 3598 - 3610
  • [2] Edge-Based Communication Optimization for Distributed Federated Learning
    Wang, Tian
    Liu, Yan
    Zheng, Xi
    Dai, Hong-Ning
    Jia, Weijia
    Xie, Mande
    [J]. IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2022, 9 (04): : 2015 - 2024
  • [3] A Stochastic Gradient-Based Projection Algorithm for Distributed Constrained Optimization
    Zhang, Keke
    Gao, Shanfu
    Chen, Yingjue
    Zheng, Zuqing
    Lu, Qingguo
    [J]. NEURAL INFORMATION PROCESSING, ICONIP 2023, PT I, 2024, 14447 : 356 - 367
  • [4] Distributed Adaptive Gradient Algorithm With Gradient Tracking for Stochastic Nonconvex Optimization
    Han, Dongyu
    Liu, Kun
    Lin, Yeming
    Xia, Yuanqing
    [J]. IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (09) : 6333 - 6340
  • [5] An Edge-based Distributed Algorithm for Economic Dispatch in Power Systems
    Zheng, Lifeng
    Du, Zhenyuan
    Li, Huaqing
    [J]. 2020 IEEE 16TH INTERNATIONAL CONFERENCE ON CONTROL & AUTOMATION (ICCA), 2020, : 883 - 888
  • [6] Distributed zero-gradient-sum optimisation algorithm with an edge-based adaptive event-triggered mechanism
    Liu, Jiayun
    [J]. INTERNATIONAL JOURNAL OF CONTROL, 2024,
  • [7] Distributed gradient descent method with edge-based event-driven communication for non-convex optimization
    Adachi, T.
    Hayashi, N.
    Takai, S.
    [J]. IET CONTROL THEORY AND APPLICATIONS, 2021, 15 (12): : 1588 - 1598
  • [8] A Distributed Stochastic Proximal-Gradient Algorithm for Composite Optimization
    Niu, Youcheng
    Li, Huaqing
    Wang, Zheng
    Lu, Qingguo
    Xia, Dawen
    Ji, Lianghao
    [J]. IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2021, 8 (03): : 1383 - 1393
  • [9] Stochastic Strongly Convex Optimization via Distributed Epoch Stochastic Gradient Algorithm
    Yuan, Deming
    Ho, Daniel W. C.
    Xu, Shengyuan
    [J]. IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2021, 32 (06) : 2344 - 2357