Distributed Evolution Strategies for Black-Box Stochastic Optimization

Cited by: 6
Authors
He, Xiaoyu [1 ,2 ]
Zheng, Zibin [1 ]
Chen, Chuan [1 ]
Zhou, Yuren [1 ]
Luo, Chuan [3 ]
Lin, Qingwei [4 ]
Affiliations
[1] Sun Yat Sen Univ, Sch Comp Sci & Engn, Guangzhou 510006, Peoples R China
[2] Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
[3] Beihang Univ, Sch Software, Beijing 100191, Peoples R China
[4] Microsoft Res, Beijing 100080, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Smoothing methods; Stochastic processes; Convergence; Optimization methods; Machine learning; Linear programming; Distributed databases; Evolution strategies; distributed optimization; black-box optimization; stochastic optimization; zeroth-order methods; ALGORITHMS; CONVERGENCE;
DOI
10.1109/TPDS.2022.3168873
Chinese Library Classification (CLC)
TP301 [Theory, Methods];
Discipline Code
081202;
Abstract
This work concerns evolutionary approaches to distributed stochastic black-box optimization, in which each worker individually solves an approximation of the problem with nature-inspired algorithms. We propose a distributed evolution strategy (DES) algorithm that combines a suitable modification of evolution strategies, a family of classic evolutionary algorithms, with existing distributed frameworks. On smooth nonconvex landscapes, DES attains a convergence rate competitive with existing zeroth-order methods and, when sparsity is present, can exploit it to match the rate of first-order methods. DES uses a Gaussian probability model to guide the search, thereby avoiding the numerical issues caused by the finite-difference techniques in existing zeroth-order methods. It is also fully adaptive to the problem landscape, since its convergence is guaranteed under any parameter setting. We further propose two alternative sampling schemes that significantly improve sampling efficiency while yielding similar performance. Simulation studies on several machine learning problems suggest that the proposed methods hold much promise for reducing convergence time and improving robustness to parameter settings.
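Since only the abstract is reproduced here, the sketch below is a generic illustration of the two ingredients it mentions, an evolution strategy that samples candidates from a Gaussian around the current search mean (rather than forming finite-difference gradient estimates) and periodic parameter averaging across workers; it is not the authors' DES algorithm. The functions noisy_loss, local_es_step, and distributed_es, the rank-based utilities, and all hyperparameter values are illustrative assumptions.

    import numpy as np

    # A noisy black-box objective standing in for a stochastic loss: only
    # (noisy) function values are available, no gradients. Purely illustrative.
    def noisy_loss(x, rng):
        return float(np.sum(x ** 2) + 0.01 * rng.standard_normal())

    def local_es_step(x, sigma, lr, pop_size, rng):
        # One evolution-strategy step on a single worker: sample candidates
        # from a Gaussian centred at the current mean x, rank them, and move
        # the mean toward the better candidates. No finite-difference gradient
        # estimate is ever formed.
        noise = rng.standard_normal((pop_size, x.size))
        losses = np.array([noisy_loss(x + sigma * z, rng) for z in noise])
        ranks = np.argsort(np.argsort(losses))          # rank 0 = lowest loss
        utils = (pop_size - 1 - ranks) / max(pop_size - 1, 1)
        utils -= utils.mean()                           # zero-centred utilities
        step = (utils @ noise) / (pop_size * sigma)     # recombined perturbation
        return x + lr * step                            # shift mean toward better samples

    def distributed_es(dim=20, workers=4, rounds=50, local_steps=5,
                       pop_size=16, sigma=0.1, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        x_global = rng.standard_normal(dim)
        for _ in range(rounds):
            local_means = []
            for _ in range(workers):
                # Each worker independently refines its own copy of the mean...
                x = x_global.copy()
                wrng = np.random.default_rng(rng.integers(1 << 31))
                for _ in range(local_steps):
                    x = local_es_step(x, sigma, lr, pop_size, wrng)
                local_means.append(x)
            # ...and the copies are periodically averaged, in the spirit of
            # parameter-averaging distributed frameworks.
            x_global = np.mean(local_means, axis=0)
        return x_global

    if __name__ == "__main__":
        x_final = distributed_es()
        print("final loss ~", float(np.sum(x_final ** 2)))

Rank-based utilities make each update invariant to monotone rescaling of the noisy loss values, which is one reason evolution strategies tolerate stochastic objectives well.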
Pages: 3718-3731
Number of pages: 14
Related Papers
50 records (10 shown)
  • [1] Distributed Evolution Strategies With Multi-Level Learning for Large-Scale Black-Box Optimization
    Duan, Qiqi
    Shao, Chang
    Zhou, Guochen
    Zhang, Minghan
    Zhao, Qi
    Shi, Yuhui
    IEEE TRANSACTIONS ON PARALLEL AND DISTRIBUTED SYSTEMS, 2024, 35 (11) : 2087 - 2101
  • [2] DISTRIBUTED BLACK-BOX OPTIMIZATION OF NONCONVEX FUNCTIONS
    Valcarcel Macua, Sergio
    Zazo, Santiago
    Zazo, Javier
    2015 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING (ICASSP), 2015, : 3591 - 3595
  • [3] Investigating the Impact of Adaptation Sampling in Natural Evolution Strategies on Black-box Optimization Testbeds
    Schaul, Tom
    PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION COMPANION (GECCO'12), 2012, : 221 - 228
  • [4] Benchmarking Separable Natural Evolution Strategies on the Noiseless and Noisy Black-box Optimization Testbeds
    Schaul, Tom
    PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION COMPANION (GECCO'12), 2012, : 205 - 212
  • [5] An Evolution Strategy for Black-box Optimization on Matrix Manifold
    He X.-Y.
    Zhou Y.-R.
    Chen Z.-F.
    Jisuanji Xuebao/Chinese Journal of Computers, 2020, 43 (09): : 1604 - 1623
  • [6] Benchmarking Exponential Natural Evolution Strategies on the Noiseless and Noisy Black-box Optimization Testbeds
    Schaul, Tom
    PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION COMPANION (GECCO'12), 2012, : 213 - 220
  • [7] Distributionally Constrained Black-Box Stochastic Gradient Estimation and Optimization
    Lam, Henry
    Zhang, Junhui
    OPERATIONS RESEARCH, 2024,
  • [8] Distributed Black-Box Optimization via Error Correcting Codes
    Bartan, Burak
    Pilanci, Mert
    2019 57TH ANNUAL ALLERTON CONFERENCE ON COMMUNICATION, CONTROL, AND COMPUTING (ALLERTON), 2019, : 246 - 252
  • [9] Optimistic tree search strategies for black-box combinatorial optimization
    Malherbe, Cedric
    Grosnit, Antoine
    Tutunov, Rasul
    Wang, Jun
    Bou-Ammar, Haitham
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [10] Benchmarking Natural Evolution Strategies with Adaptation Sampling on the Noiseless and Noisy Black-box Optimization Testbeds
    Schaul, Tom
    PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION COMPANION (GECCO'12), 2012, : 229 - 236