Zeroth-Order Method for Distributed Optimization With Approximate Projections

Cited by: 41
Authors
Yuan, Deming [1]
Ho, Daniel W. C. [2,3]
Xu, Shengyuan [3]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Coll Automat, Nanjing 210023, Jiangsu, Peoples R China
[2] City Univ Hong Kong, Dept Math, Hong Kong, Hong Kong, Peoples R China
[3] Nanjing Univ Sci & Technol, Sch Automat, Nanjing 210094, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Approximate projection; convex optimization; distributed optimization; gradient estimator; networked systems; SUBGRADIENT METHODS; MULTIAGENT OPTIMIZATION; CONVEX-OPTIMIZATION; CONSENSUS; ALGORITHMS; NETWORKS; SYSTEMS; AGENTS;
DOI
10.1109/TNNLS.2015.2480419
CLC number
TP18 [Theory of Artificial Intelligence];
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
This paper studies the problem of minimizing a sum of (possibly nonsmooth) convex functions, each associated with one of multiple interacting nodes, subject to a convex state constraint set. A time-varying directed network is considered. Two types of computational constraints are investigated: one in which gradient information is unavailable, and one in which the projection steps can only be computed approximately. We devise a distributed zeroth-order method whose implementation requires only function evaluations and approximate projections. In particular, we show that the proposed method generates expected function-value sequences that converge to the optimal value, provided that the projection errors decrease at appropriate rates.
Pages: 284-294
Number of pages: 11
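
The record above contains no pseudocode; purely as an illustrative aid, the following is a minimal sketch, not the authors' exact algorithm, of the two ingredients the abstract describes: a two-point zeroth-order gradient estimator that uses only function evaluations, and an approximate projection step whose error decays with the iteration count, combined with a simple consensus-averaging step over the network. The problem instance, the step-size and smoothing schedules, the Gaussian-direction estimator, and the fixed doubly stochastic weight matrix are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n nodes, each holding a nonsmooth convex local cost
# f_i(x) = |a_i^T x - b_i|; the shared constraint set is a Euclidean ball.
n, d = 5, 3
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)
radius = 2.0
local_costs = [lambda x, i=i: abs(A[i] @ x - b[i]) for i in range(n)]


def zo_gradient(f, x, mu):
    """Two-point zeroth-order (sub)gradient estimate of f at x.

    Built from function evaluations only, using a random Gaussian
    direction u: g = (f(x + mu*u) - f(x)) / mu * u.
    """
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x)) / mu * u


def approx_project(x, eps):
    """Exact projection onto the ball of radius `radius`, perturbed by a
    bounded error of size eps to model an inexact projection oracle."""
    nrm = np.linalg.norm(x)
    p = x if nrm <= radius else radius * x / nrm
    v = rng.standard_normal(x.shape)
    return p + eps * v / np.linalg.norm(v)


# Consensus weights: a fixed doubly stochastic matrix for simplicity
# (the paper itself handles time-varying directed graphs).
W = np.full((n, n), 1.0 / n)
X = rng.standard_normal((n, d))       # one local iterate per node

for k in range(1, 2001):
    alpha = 1.0 / np.sqrt(k)          # diminishing step size (assumed)
    mu = 1.0 / k                      # smoothing parameter (assumed)
    eps = 1.0 / k ** 2                # projection error, decaying fast
    X_mix = W @ X                     # consensus / averaging step
    for i in range(n):
        g = zo_gradient(local_costs[i], X_mix[i], mu)
        X[i] = approx_project(X_mix[i] - alpha * g, eps)

x_bar = X.mean(axis=0)
print("consensus estimate:", x_bar)
print("total objective   :", sum(f(x_bar) for f in local_costs))
```

The design point mirrored from the abstract is that the projection-error tolerance eps is driven to zero faster than the step size; this is the kind of condition under which the paper establishes convergence of the expected function values to the optimal value.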