Convergence of Distributed Gradient-Tracking-Based Optimization Algorithms with Random Graphs

Cited: 0
Authors
WANG Jiexiang [1 ]
FU Keli [2 ]
GU Yu [2 ]
LI Tao [2 ]
Affiliations
[1] School of Mechatronic Engineering and Automation, Shanghai University
[2] Key Laboratory of Pure Mathematics and Mathematical Practice, School of Mathematical Sciences, East China Normal University
Keywords
Distributed optimization; geometric convergence; gradient tracking; random graph;
DOI
Not available
Chinese Library Classification
O224 [Mathematical theory of optimization];
Discipline Codes
070105; 1201
Abstract
This paper studies distributed convex optimization over a multi-agent system, where each agent owns only a local cost function that is convex with Lipschitz continuous gradients. The goal of the agents is to cooperatively minimize the sum of the local cost functions. The underlying communication networks are modelled by a sequence of random and balanced digraphs, which are not required to be spatially or temporally independent or to follow any special distribution. The authors use a distributed gradient-tracking-based optimization algorithm to solve the optimization problem. In the algorithm, each agent maintains an estimate of the optimal solution and an estimate of the average of all the local gradients. These estimates are updated by combining a consensus step with a gradient-tracking step. The authors prove that the algorithm converges to the optimal solution at a geometric rate if the conditional graphs are uniformly strongly connected, the global cost function is strongly convex, and the step-sizes do not exceed certain upper bounds.
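The update described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact algorithm (in particular, it uses a fixed doubly stochastic mixing matrix on a ring rather than a sequence of random balanced digraphs, and simple quadratic local costs); the weights, step-size, and cost parameters below are assumptions chosen for illustration. Each agent keeps a solution estimate `x[i]` and a tracker `y[i]` of the average gradient:

```python
import numpy as np

# Illustrative sketch of gradient tracking (assumed setup, not the paper's
# exact algorithm): fixed doubly stochastic mixing matrix W on a ring graph,
# strongly convex quadratic local costs f_i(x) = 0.5*a_i*x^2 + b_i*x, so the
# minimizer of sum_i f_i is x* = -sum(b)/sum(a).

rng = np.random.default_rng(0)
n = 5                                  # number of agents
a = rng.uniform(1.0, 2.0, n)           # local curvatures (strong convexity)
b = rng.uniform(-1.0, 1.0, n)
x_star = -b.sum() / a.sum()            # global optimum

def grad(i, x):
    """Gradient of the i-th local cost at x."""
    return a[i] * x + b[i]

# Doubly stochastic mixing matrix for a ring (a balanced graph).
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

alpha = 0.05                           # step-size below the stability bound
x = rng.standard_normal(n)             # solution estimates
y = np.array([grad(i, x[i]) for i in range(n)])  # gradient trackers

for _ in range(2000):
    x_new = W @ x - alpha * y          # consensus step + tracked-gradient step
    g_old = np.array([grad(i, x[i]) for i in range(n)])
    g_new = np.array([grad(i, x_new[i]) for i in range(n)])
    y = W @ y + g_new - g_old          # track the average of local gradients
    x = x_new

print(np.max(np.abs(x - x_star)))      # all agents near the global optimum
```

The tracker `y` is initialized at each agent's own gradient; the update `y ← W y + ∇f_i(x⁺) − ∇f_i(x)` preserves the average of the trackers, which is why each agent can follow the global gradient direction using only local information.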
Pages: 1438-1453
Page count: 16
Related Papers
50 records
  • [21] Differential Privacy in Distributed Optimization With Gradient Tracking
    Huang, Lingying
    Wu, Junfeng
    Shi, Dawei
    Dey, Subhrakanti
    Shi, Ling
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2024, 69 (09) : 5727 - 5742
  • [22] Triggered Gradient Tracking for asynchronous distributed optimization
    Carnevale, Guido
    Notarnicola, Ivano
    Marconi, Lorenzo
    Notarstefano, Giuseppe
    AUTOMATICA, 2023, 147
  • [23] Almost sure convergence of random projected proximal and subgradient algorithms for distributed nonsmooth convex optimization
    Iiduka, Hideaki
    OPTIMIZATION, 2017, 66 (01) : 35 - 59
  • [24] Convergence Rate of Push-Sum Algorithms on Random Graphs
    Rezaienia, Pouya
    Gharesifard, Bahman
    Linder, Tamas
    Touri, Behrouz
    2018 IEEE CONFERENCE ON DECISION AND CONTROL (CDC), 2018, : 4218 - 4223
  • [25] RANDOM GRADIENT EXTRAPOLATION FOR DISTRIBUTED AND STOCHASTIC OPTIMIZATION
    Lan, Guanghui
    Zhou, Yi
    SIAM JOURNAL ON OPTIMIZATION, 2018, 28 (04) : 2753 - 2782
  • [26] Gradient-tracking based differentially private distributed optimization with enhanced optimization accuracy
    Xuan, Yu
    Wang, Yongqiang
    AUTOMATICA, 2023, 155
  • [27] DISTRIBUTED NESTEROV GRADIENT METHODS FOR RANDOM NETWORKS: CONVERGENCE IN PROBABILITY AND CONVERGENCE RATES
    Jakovetic, Dusan
    Xavier, Joao
    Moura, Jose M. F.
    2014 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2014,
  • [28] Speed and convergence properties of gradient algorithms for optimization of IMRT
    Zhang, XD
    Liu, H
    Wang, XC
    Dong, L
    Wu, QW
    Mohan, R
    MEDICAL PHYSICS, 2004, 31 (05) : 1141 - 1152
  • [29] Gradient algorithms for quadratic optimization with fast convergence rates
    Pronzato, Luc
    Zhigljavsky, Anatoly
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2011, 50 (03) : 597 - 617