Distributed gradient-free and projection-free algorithm for stochastic constrained optimization

Cited by: 0
Authors
Hou J. [1]
Zeng X. [1]
Chen C. [1]
Affiliations
[1] National Key Laboratory of Autonomous Intelligent Unmanned Systems, School of Automation, Beijing Institute of Technology, Beijing
Source
Autonomous Intelligent Systems | 2024, Vol. 4, No. 1
Funding
National Natural Science Foundation of China
Keywords
Distributed optimization; Projection-free method; Stochastic constrained optimization; Zeroth-order optimization;
DOI
10.1007/s43684-024-00062-0
Abstract
Distributed stochastic zeroth-order optimization (DSZO), in which the objective function is allocated over multiple agents and the derivatives of the cost functions are unavailable, arises frequently in large-scale machine learning and reinforcement learning. This paper introduces a distributed stochastic algorithm for DSZO in a projection-free and gradient-free manner via the Frank-Wolfe framework and a stochastic zeroth-order oracle (SZO). Such a scheme is particularly useful in large-scale constrained optimization problems where calculating gradients or projection operators is impractical or costly, or where the objective function is not differentiable everywhere. Specifically, the proposed algorithm, enhanced by recursive momentum and gradient tracking techniques, guarantees convergence with just a single batch per iteration, a significant improvement over existing algorithms that substantially lowers the computational complexity. Under mild conditions, we prove that the SZO complexity bounds of the proposed algorithm are O(n/ϵ²) and O(n(21)) for the convex and nonconvex cases, respectively. The efficacy of the algorithm is verified on black-box binary classification problems against several competing alternatives. © The Author(s) 2024.
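The ingredients named in the abstract (a two-point stochastic zeroth-order gradient estimate, a linear-minimization oracle in place of a projection, and recursive momentum) can be illustrated with a minimal single-agent sketch. This is not the authors' distributed algorithm: the simplex constraint set, the step-size and momentum schedules, and all function names below are illustrative assumptions.

```python
import numpy as np

def zo_gradient(f, x, mu=1e-4, rng=None):
    """Two-point zeroth-order gradient estimate (SZO-style):
    sample a random direction u and take a finite difference along it."""
    rng = rng or np.random.default_rng()
    u = rng.standard_normal(x.shape)
    return (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def lmo_simplex(g):
    """Linear minimization oracle over the probability simplex:
    argmin_{s in simplex} <g, s> is the vertex at the smallest coordinate of g.
    This replaces a projection step, making the method projection-free."""
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

def zo_frank_wolfe(f, x0, T=200, mu=1e-4, seed=0):
    """Projection-free, gradient-free Frank-Wolfe loop with a simple
    recursive-momentum average of the zeroth-order gradient estimates."""
    rng = np.random.default_rng(seed)
    x = x0.copy()
    d = np.zeros_like(x0)                # momentum-averaged gradient estimate
    for t in range(1, T + 1):
        rho = 1.0 / t                    # momentum weight (illustrative choice)
        g = zo_gradient(f, x, mu, rng)   # single sample per iteration
        d = (1 - rho) * d + rho * g      # recursive momentum update
        s = lmo_simplex(d)               # LMO instead of projection
        gamma = 2.0 / (t + 2)            # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * s  # convex combination stays feasible
    return x
```

Because every iterate is a convex combination of feasible points, the constraint is maintained without ever projecting; the recursive momentum term averages out the variance of the single-sample zeroth-order estimates, which is what lets such schemes get by with one batch per iteration.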
Related papers
50 records in total
  • [31] Reactive Power Optimization for Distribution Network Based on Distributed Random Gradient-Free Algorithm
    Xie, Jun
    Liang, Chunxiang
    Xiao, Yichen
    ENERGIES, 2018, 11 (03):
  • [32] Distributed Event-Triggered Random Gradient-Free Optimization Algorithm For Multiagent Systems
    Hu, Xiaojing
    Zhang, Huifeng
    Zhuo, Qingze
    2021 PROCEEDINGS OF THE 40TH CHINESE CONTROL CONFERENCE (CCC), 2021, : 4972 - 4977
  • [33] A stochastic subspace approach to gradient-free optimization in high dimensions
    Kozak, David
    Becker, Stephen
    Doostan, Alireza
    Tenorio, Luis
    COMPUTATIONAL OPTIMIZATION AND APPLICATIONS, 2021, 79 (02) : 339 - 368
  • [34] Projection-Free Bandit Convex Optimization
    Chen, Lin
    Zhang, Mingrui
    Karbasi, Amin
    22ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 89, 2019, 89
  • [35] A conjecture on global optimization using gradient-free stochastic approximation
    Maryak, JL
    Chin, DC
    JOINT CONFERENCE ON THE SCIENCE AND TECHNOLOGY OF INTELLIGENT SYSTEMS, 1998, : 441 - 445
  • [36] Gradient-Free Methods for Deterministic and Stochastic Nonsmooth Nonconvex Optimization
    Lin, Tianyi
    Zheng, Zeyu
    Jordan, Michael I.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35, NEURIPS 2022, 2022,
  • [37] Projection-Free Bandit Optimization with Privacy Guarantees
    Ene, Alina
    Nguyen, Huy L.
    Vladu, Adrian
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 7322 - 7330
  • [38] Projection-free Distributed Online Learning in Networks
    Zhang, Wenpeng
    Zhao, Peilin
    Zhu, Wenwu
    Hoi, Steven C. H.
    Zhang, Tong
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 70, 2017, 70
  • [39] Strong consistency of random gradient-free algorithms for distributed optimization
    Chen, Xing-Min
    Gao, Chao
    OPTIMAL CONTROL APPLICATIONS &amp; METHODS, 2017, 38 (02): 247 - 265