Linear Convergence of First- and Zeroth-Order Primal-Dual Algorithms for Distributed Nonconvex Optimization

Cited: 15
Authors
Yi, Xinlei [1,2]
Zhang, Shengjun [3]
Yang, Tao [4]
Chai, Tianyou [4]
Johansson, Karl H. [1,2]
Affiliations
[1] KTH Royal Inst Technol, Sch Elect Engn & Comp Sci, Div Decis & Control Syst, S-11428 Stockholm, Sweden
[2] Digital Futures, S-11428 Stockholm, Sweden
[3] Univ North Texas, Dept Elect Engn, Denton, TX 76203 USA
[4] Northeastern Univ, State Key Lab Synthet Automat Proc Ind, Shenyang 110819, Peoples R China
Funding
National Natural Science Foundation of China; Swedish Research Council;
Keywords
Convergence; Cost function; Convex functions; Costs; Technological innovation; Lyapunov methods; Laplace equations; Distributed nonconvex optimization; first-order algorithm; linear convergence; primal-dual algorithm; zeroth-order algorithm; CONVEX-OPTIMIZATION; MULTIAGENT OPTIMIZATION; CONSENSUS; ADMM;
DOI
10.1109/TAC.2021.3108501
CLC Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
This article considers the distributed nonconvex optimization problem of minimizing a global cost function, formed as the sum of local cost functions, using only local information exchange. We first consider a distributed first-order primal-dual algorithm and show that it converges sublinearly to a stationary point when each local cost function is smooth, and linearly to a global optimum when, in addition, the global cost function satisfies the Polyak-Łojasiewicz (P-Ł) condition. The P-Ł condition is weaker than strong convexity, the standard assumption for proving linear convergence of distributed optimization algorithms, and it does not require the global minimizer to be unique. Motivated by settings where gradients are unavailable, we then propose a distributed zeroth-order algorithm, derived from the first-order algorithm by replacing the gradients with a deterministic gradient estimator, and show that it has the same convergence properties as the first-order algorithm under the same conditions. The theoretical results are illustrated by numerical simulations.
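The abstract describes two ingredients: a first-order primal-dual update over a communication graph, and a zeroth-order variant obtained by swapping the true gradients for a deterministic (coordinate-wise central-difference) estimator. The sketch below illustrates a generic algorithm of this type, not the paper's exact update rule: the step sizes `eta`, `alpha`, `beta`, the ring communication graph, and the function names are illustrative assumptions.

```python
import numpy as np

def laplacian_ring(n):
    """Graph Laplacian of an undirected n-node ring (the communication graph)."""
    L = 2.0 * np.eye(n)
    for i in range(n):
        L[i, (i + 1) % n] -= 1.0
        L[i, (i - 1) % n] -= 1.0
    return L

def central_difference_gradient(f, delta=1e-6):
    """Deterministic 2d-point gradient estimator: queries only values of f,
    never its gradient (the zeroth-order setting)."""
    def grad(x):
        g = np.zeros_like(x)
        for j in range(x.size):
            e = np.zeros_like(x)
            e[j] = 1.0
            g[j] = (f(x + delta * e) - f(x - delta * e)) / (2.0 * delta)
        return g
    return grad

def distributed_primal_dual(local_grads, d, L, eta=0.05, alpha=2.0,
                            beta=1.0, iters=2000):
    """Each agent i keeps a primal variable x_i in R^d and a dual variable v_i
    that accumulates disagreement with its neighbors (measured through L).
    Agents only exchange information with graph neighbors, via L @ x."""
    n = len(local_grads)
    x = np.zeros((n, d))
    v = np.zeros((n, d))
    for _ in range(iters):
        g = np.array([grad(xi) for grad, xi in zip(local_grads, x)])
        Lx = L @ x  # only neighbor-to-neighbor communication
        # simultaneous primal descent / dual ascent step
        x, v = x - eta * (g + alpha * Lx + v), v + eta * beta * Lx
    return x
```

Passing analytic gradients gives the first-order algorithm; wrapping the local costs with `central_difference_gradient` gives the zeroth-order variant, mirroring how the paper derives one from the other.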
Pages: 4194-4201
Page count: 8
Related Papers
50 records in total
  • [21] Communication Efficient Primal-Dual Algorithm for Nonconvex Nonsmooth Distributed Optimization
    Chen, Congliang
    Zhang, Jiawei
    Shen, Li
    Zhao, Peilin
    Luo, Zhi-Quan
    [J]. 24TH INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS (AISTATS), 2021, 130
  • [22] Distributed zeroth-order optimization: Convergence rates that match centralized counterpart
    Yuan, Deming
    Wang, Lei
    Proutiere, Alexandre
    Shi, Guodong
    [J]. AUTOMATICA, 2024, 159
  • [23] Zeroth-Order Stochastic Variance Reduction for Nonconvex Optimization
    Liu, Sijia
    Kailkhura, Bhavya
    Chen, Pin-Yu
    Ting, Paishun
    Chang, Shiyu
    Amini, Lisa
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [24] A First-Order Primal-Dual Method for Nonconvex Constrained Optimization Based on the Augmented Lagrangian
    Zhu, Daoli
    Zhao, Lei
    Zhang, Shuzhong
    [J]. MATHEMATICS OF OPERATIONS RESEARCH, 2024, 49 (01) : 125 - 150
  • [26] Zeroth-order single-loop algorithms for nonconvex-linear minimax problems
    Shen, Jingjing
    Wang, Ziqi
    Xu, Zi
    [J]. JOURNAL OF GLOBAL OPTIMIZATION, 2023, 87 (2-4) : 551 - 580
  • [28] Resilient Primal-Dual Optimization Algorithms for Distributed Resource Allocation
    Turan, Berkay
    Uribe, Cesar A.
    Wai, Hoi-To
    Alizadeh, Mahnoosh
    [J]. IEEE TRANSACTIONS ON CONTROL OF NETWORK SYSTEMS, 2021, 8 (01): : 282 - 294
  • [29] NON-STATIONARY FIRST-ORDER PRIMAL-DUAL ALGORITHMS WITH FASTER CONVERGENCE RATES
    Tran-Dinh, Quoc
    Zhu, Yuzixuan
    [J]. SIAM JOURNAL ON OPTIMIZATION, 2020, 30 (04) : 2866 - 2896
  • [30] Zeroth-Order Methods for Nondifferentiable, Nonconvex, and Hierarchical Federated Optimization
    Qiu, Yuyang
    Shanbhag, Uday V.
    Yousefian, Farzad
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,