Distributed gradient-free and projection-free algorithm for stochastic constrained optimization

Cited: 0
Authors
Hou J. [1 ]
Zeng X. [1 ]
Chen C. [1 ]
Affiliations
[1] National Key Laboratory of Autonomous Intelligent Unmanned Systems, School of Automation, Beijing Institute of Technology, Beijing
Source
Autonomous Intelligent Systems | 2024, Vol. 4, No. 1
Funding
National Natural Science Foundation of China
Keywords
Distributed optimization; Projection-free method; Stochastic constrained optimization; Zeroth-order optimization;
DOI
10.1007/s43684-024-00062-0
Abstract
Distributed stochastic zeroth-order optimization (DSZO), in which the objective function is allocated over multiple agents and the derivatives of cost functions are unavailable, arises frequently in large-scale machine learning and reinforcement learning. This paper introduces a distributed stochastic algorithm for DSZO that is both projection-free and gradient-free, built on the Frank-Wolfe framework and a stochastic zeroth-order oracle (SZO). Such a scheme is particularly useful in large-scale constrained optimization problems where calculating gradients or projection operators is impractical or costly, or where the objective function is not differentiable everywhere. Specifically, the proposed algorithm, enhanced by recursive momentum and gradient tracking techniques, guarantees convergence with just a single batch per iteration; this substantially lowers the computational complexity compared with existing algorithms. Under mild conditions, we prove that the SZO complexity bounds of the proposed algorithm are O(n/ϵ²) for the convex case and O(n/ϵ³) for the nonconvex case. The efficacy of the algorithm is verified on black-box binary classification problems against several competing alternatives. © The Author(s) 2024.
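The building blocks named in the abstract — a two-point stochastic zeroth-order gradient estimate, a Frank-Wolfe (linear-minimization) update in place of a projection, and recursive momentum on the gradient estimate — can be sketched for a single agent as below. This is a minimal illustration, not the paper's algorithm: the simplex constraint, the step-size and momentum schedules, and all function names are assumptions chosen for the example.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order (SZO) gradient estimate:
    g = [f(x + mu*u) - f(x - mu*u)] / (2*mu) * u,  u ~ N(0, I)."""
    rng = rng or np.random.default_rng(0)
    u = rng.standard_normal(x.size)
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def simplex_lmo(g):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <g, s> is the vertex at the smallest coordinate."""
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def zo_frank_wolfe(f, x0, iters=500, mu=1e-4):
    """Projection-free, gradient-free Frank-Wolfe sketch with recursive
    momentum on the zeroth-order gradient estimate (illustrative schedules)."""
    x = x0.copy()
    d = np.zeros_like(x)                 # momentum-averaged gradient estimate
    for t in range(1, iters + 1):
        rho = 2.0 / (t + 1)              # diminishing momentum weight
        g = zo_gradient(f, x, mu=mu, rng=np.random.default_rng(t))
        d = (1 - rho) * d + rho * g      # recursive momentum (variance reduction)
        s = simplex_lmo(d)               # LMO replaces the projection step
        gamma = 2.0 / (t + 2)            # standard Frank-Wolfe step size
        x = x + gamma * (s - x)          # convex combination: stays feasible
    return x
```

Because each update is a convex combination of feasible points, the iterates remain in the constraint set without any projection; only function evaluations (no derivatives) are used, matching the black-box setting the paper targets.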
Related papers (50 total)
  • [1] Accelerated Stochastic Gradient-free and Projection-free Methods
    Huang, Feihu
    Tao, Lue
    Chen, Songcan
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020
  • [2] Distributed projection-free algorithm for constrained aggregative optimization
    Wang, Tongyu
    Yi, Peng
    INTERNATIONAL JOURNAL OF ROBUST AND NONLINEAR CONTROL, 2023, 33 (10) : 5273 - 5288
  • [3] A Projection-free Algorithm for Constrained Stochastic Multi-level Composition Optimization
    Xiao, Tesi
    Balasubramanian, Krishnakumar
    Ghadimi, Saeed
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [4] Distributed Randomized Gradient-Free Mirror Descent Algorithm for Constrained Optimization
    Yu, Zhan
    Ho, Daniel W. C.
    Yuan, Deming
    IEEE TRANSACTIONS ON AUTOMATIC CONTROL, 2022, 67 (02) : 957 - 964
  • [5] Projection-Free Online Optimization with Stochastic Gradient: From Convexity to Submodularity
    Chen, Lin
    Harshaw, Christopher
    Hassani, Hamed
    Karbasi, Amin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 80, 2018
  • [6] An efficient gradient-free projection algorithm for constrained nonlinear equations and image restoration
    Ibrahim, Abdulkarim Hassan
    Kumam, Poom
    Abubakar, Auwal Bala
    Yusuf, Umar Batsari
    Yimer, Seifu Endris
    Aremu, Kazeem Olalekan
    AIMS MATHEMATICS, 2021, 6 (01): : 235 - 260
  • [7] Projection-free nonconvex stochastic optimization on Riemannian manifolds
    Weber, Melanie
    Sra, Suvrit
    IMA JOURNAL OF NUMERICAL ANALYSIS, 2022, 42 (04) : 3241 - 3271
  • [8] Projection-Free Stochastic Bi-Level Optimization
    Akhtar, Zeeshan
    Bedi, Amrit Singh
    Thomdapu, Srujan Teja
    Rajawat, Ketan
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2022, 70 : 6332 - 6347
  • [9] Quantized Distributed Online Projection-Free Convex Optimization
    Zhang, Wentao
    Shi, Yang
    Zhang, Baoyong
    Lu, Kaihong
    Yuan, Deming
    IEEE CONTROL SYSTEMS LETTERS, 2023, 7 : 1837 - 1842
  • [10] Distributed Online Optimization With Gradient-free Design
    Wang, Lingfei
    Wang, Yinghui
    Hong, Yiguang
    PROCEEDINGS OF THE 38TH CHINESE CONTROL CONFERENCE (CCC), 2019, : 5677 - 5682