Distributed zeroth-order optimization: Convergence rates that match centralized counterpart

Cited by: 4
Authors
Yuan, Deming [1 ]
Wang, Lei [2 ]
Proutiere, Alexandre [3 ]
Shi, Guodong [4 ]
Affiliations
[1] Nanjing Univ Sci & Technol, Sch Automat, Nanjing 210094, Jiangsu, Peoples R China
[2] Zhejiang Univ, Coll Control Sci & Engn, State Key Lab Ind Control Technol, Hangzhou 310058, Peoples R China
[3] KTH Royal Inst Technol, Dept Automat Control, SE-10044 Stockholm, Sweden
[4] Univ Sydney, Sydney Inst Robot & Intelligent Syst, Australian Ctr Field Robot, Sydney, NSW 2006, Australia
Funding
Australian Research Council; National Natural Science Foundation of China;
Keywords
Distributed optimization; Zeroth-order optimization; Multi-stage optimization algorithm; Optimal convergence rate; MULTIAGENT OPTIMIZATION; CONVEX-OPTIMIZATION; SUBGRADIENT METHODS; ALGORITHM; POWER;
DOI
10.1016/j.automatica.2023.111328
Chinese Library Classification
TP [Automation Technology; Computer Technology];
Subject Classification Code
0812;
Abstract
Zeroth-order optimization has become increasingly important in complex optimization and machine learning when cost functions cannot be described in closed analytical form. The key idea of zeroth-order optimization is that a learner can build gradient estimates from queries sent to the cost function; traditional gradient descent algorithms can then be executed with the gradients replaced by these estimates. For optimization over large-scale multi-agent systems with decentralized data and costs, zeroth-order optimization can still be used to develop scalable and distributed algorithms. In this paper, we aim to understand how performance, measured in terms of convergence rates, transitions from centralized to distributed zeroth-order algorithms, focusing on multi-agent systems with time-varying communication networks. We establish a series of convergence rates for distributed zeroth-order subgradient algorithms under both one-point and two-point zeroth-order oracles. Apart from the additional node-to-node communication cost due to the distributed nature of the algorithms, the established convergence rates are shown to match their centralized counterparts. We also propose a multi-stage distributed zeroth-order algorithm that better utilizes the learning rates, reduces the computational complexity, and attains even faster convergence rates for compact decision sets. © 2023 Elsevier Ltd. All rights reserved.
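
To make the oracle idea concrete, below is a minimal sketch of a distributed two-point zeroth-order subgradient method over a time-varying network. It is an illustration under assumed ingredients, not the paper's algorithm: the quadratic local costs f_i(x) = ||x - c_i||^2, the smoothing radius delta, the diminishing step-size schedule, and the randomly shifted-ring mixing matrices are all hypothetical choices made for this sketch.

import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, T = 5, 3, 3000
delta = 1e-3                             # smoothing radius (illustrative choice)

# Each agent i privately holds f_i(x) = ||x - c_i||^2 and can only query
# function values (a zeroth-order oracle), never gradients.
centers = rng.normal(size=(n_agents, dim))

def f(i, x):
    return float(np.sum((x - centers[i]) ** 2))

def two_point_grad(i, x):
    # Standard two-point estimator: dim * (f(x+du) - f(x-du)) / (2*delta) * u
    u = rng.normal(size=dim)
    u /= np.linalg.norm(u)
    return dim * (f(i, x + delta * u) - f(i, x - delta * u)) / (2 * delta) * u

x = rng.normal(size=(n_agents, dim))     # one local iterate per agent
for t in range(T):
    # Time-varying communication: a symmetric, doubly stochastic mixing
    # matrix built from a randomly shifted ring at every iteration.
    k = int(rng.integers(1, n_agents))
    W = 0.5 * np.eye(n_agents)
    for i in range(n_agents):
        W[i, (i + k) % n_agents] += 0.25
        W[(i + k) % n_agents, i] += 0.25
    x = W @ x                            # consensus (communication) step
    eta = 0.05 / np.sqrt(t + 1)          # diminishing step size (assumption)
    for i in range(n_agents):            # local zeroth-order descent step
        x[i] = x[i] - eta * two_point_grad(i, x[i])

# Agents should approach the minimizer of (1/n) * sum_i f_i, i.e. mean(centers).
print("consensus iterate:", x.mean(axis=0))
print("true minimizer:  ", centers.mean(axis=0))

Replacing the two queries at x ± delta*u with a single query, via the estimator (dim/delta) * f_i(x + delta*u) * u, gives the noisier one-point oracle the abstract also mentions; it typically requires smaller steps and yields slower rates.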
Pages: 13