PSA-CMA-ES: CMA-ES with Population Size Adaptation

Cited by: 15
Authors
Nishida, Kouhei [1 ]
Akimoto, Youhei [2 ]
Affiliations
[1] Shinshu Univ, Interdisciplinary Grad Sch Sci & Technol, Matsumoto, Nagano, Japan
[2] Univ Tsukuba, Fac Engn Informat & Syst, Tsukuba, Ibaraki, Japan
Keywords
CMA-ES; population size adaptation; multimodal functions; noisy functions; evolution strategy
DOI
10.1145/3205455.3205467
Chinese Library Classification (CLC)
TP18 [Theory of Artificial Intelligence]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
The population size, i.e., the number of candidate solutions generated at each iteration, is the most critical strategy parameter of the covariance matrix adaptation evolution strategy (CMA-ES), one of the state-of-the-art search algorithms for black-box continuous optimization. The population size needs to be larger than its default value when the objective function is well-structured multimodal and/or noisy, while it should be kept as small as possible for optimization speed. However, tuning this strategy parameter by trial and error is, in general, prohibitively expensive in a black-box optimization scenario. This paper proposes a novel strategy to adapt the population size for CMA-ES. The population size is adapted based on the estimated accuracy of the update of the normal distribution parameters. The CMA-ES with the proposed population size adaptation mechanism, PSA-CMA-ES, is tested on both noiseless and noisy benchmark functions and compared with existing strategies. The results reveal that PSA-CMA-ES works well on well-structured multimodal and/or noisy functions but causes an inefficient increase of the population size on unimodal functions. Furthermore, it is shown that PSA-CMA-ES can tackle noise and multimodality at the same time.
Pages: 865-872
Page count: 8
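The abstract describes the adaptation mechanism only at a high level: the population size is grown or shrunk according to an estimated accuracy of the update of the sampling distribution's parameters. As a rough illustration of that general idea only, and not the published PSA-CMA-ES update rule, the Python sketch below adapts the population size of a simplified (mu/mu, lambda) evolution strategy from a crude accuracy proxy (the agreement between successive mean shifts). All function names, constants, and the accuracy proxy itself are assumptions made for this sketch.

# Illustrative sketch only: a toy isotropic (mu/mu, lambda)-ES whose population
# size is grown or shrunk from a crude "update accuracy" signal, namely the
# cosine similarity of successive mean shifts. The actual PSA-CMA-ES rule in
# the paper estimates the accuracy of the update of all normal distribution
# parameters; the names and constants below are assumptions for illustration.
import numpy as np

def sphere(x):
    return float(np.dot(x, x))

def toy_es_with_pop_size_adaptation(f, dim=10, seed=1, max_evals=20000):
    rng = np.random.default_rng(seed)
    mean = rng.standard_normal(dim)
    sigma = 1.0
    lam = 4 + int(3 * np.log(dim))      # CMA-ES default population size
    lam_min, lam_max = lam, 50 * lam
    prev_shift = np.zeros(dim)
    evals = 0
    while evals < max_evals:
        lam_int = int(round(lam))
        mu = lam_int // 2
        # Sample lambda candidates, evaluate, and recombine the best mu (equal weights).
        X = mean + sigma * rng.standard_normal((lam_int, dim))
        fit = np.array([f(x) for x in X])
        evals += lam_int
        new_mean = X[np.argsort(fit)[:mu]].mean(axis=0)
        shift = new_mean - mean
        # Crude accuracy proxy: agreement (cosine similarity) of successive mean shifts.
        denom = np.linalg.norm(shift) * np.linalg.norm(prev_shift)
        agreement = float(shift @ prev_shift / denom) if denom > 0 else 0.0
        # Inconsistent updates -> grow lambda; consistent updates -> shrink it.
        lam = min(max(lam * np.exp(0.2 * (0.5 - agreement)), lam_min), lam_max)
        prev_shift, mean = shift, new_mean
        sigma *= 0.95                    # fixed decay; no step-size adaptation here
        if fit.min() < 1e-10:
            break
    return mean, lam_int, evals

if __name__ == "__main__":
    m, lam, evals = toy_es_with_pop_size_adaptation(sphere)
    print(f"final lambda={lam}, evaluations={evals}, f(mean)={sphere(m):.3e}")

On a unimodal function such as the sphere above, successive shifts tend to agree and lambda stays near its lower bound; a noisy or rugged objective makes the shifts inconsistent and drives lambda upward, which mirrors the qualitative behaviour the abstract reports for PSA-CMA-ES.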
Related Papers
50 records in total
  • [41] A (1+1)-CMA-ES for Constrained Optimisation
    Arnold, Dirk V.
    Hansen, Nikolaus
    [J]. PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE, 2012, : 297 - 304
  • [42] Eigenspace sampling in the mirrored variant of (1, λ)-CMA-ES
    Au, Chun-Kit
    Leung, Ho-Fung
    [J]. 2012 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2012
  • [43] Handling bound constraints in CMA-ES: An experimental study
    Biedrzycki, Rafal
    [J]. SWARM AND EVOLUTIONARY COMPUTATION, 2020, 52
  • [44] Making EGO and CMA-ES Complementary for Global Optimization
    Mohammadi, Hossein
    Le Riche, Rodolphe
    Touboul, Eric
    [J]. LEARNING AND INTELLIGENT OPTIMIZATION, LION 9, 2015, 8994 : 287 - 292
  • [45] Investigating the Local-Meta-Model CMA-ES for Large Population Sizes
    Bouzarkouna, Zyed
    Auger, Anne
    Ding, Didier Yu
    [J]. APPLICATIONS OF EVOLUTIONARY COMPUTATION, PT I, PROCEEDINGS, 2010, 6024 : 402 - +
  • [46] On the Impact of Active Covariance Matrix Adaptation in the CMA-ES With Mirrored Mutations and Small Initial Population Size on the Noiseless BBOB Testbed
    Brockhoff, Dimo
    Auger, Anne
    Hansen, Nikolaus
    [J]. PROCEEDINGS OF THE FOURTEENTH INTERNATIONAL CONFERENCE ON GENETIC AND EVOLUTIONARY COMPUTATION COMPANION (GECCO'12), 2012, : 291 - 296
  • [47] Computational results for an automatically tuned CMA-ES with increasing population size on the CEC'05 benchmark set
    Liao, Tianjun
    Montes de Oca, Marco A.
    Stützle, Thomas
    [J]. SOFT COMPUTING, 2013, 17 (06) : 1031 - 1046
  • [49] Reducing the time complexity of the derandomized evolution strategy with covariance matrix adaptation (CMA-ES)
    Hansen, Nikolaus
    Müller, Sibylle D.
    Koumoutsakos, Petros
    [J]. EVOLUTIONARY COMPUTATION, 2003, 11 (01) : 1 - 18
  • [50] Maximum Likelihood-Based Online Adaptation of Hyper-Parameters in CMA-ES
    Loshchilov, Ilya
    Schoenauer, Marc
    Sebag, Michele
    Hansen, Nikolaus
    [J]. PARALLEL PROBLEM SOLVING FROM NATURE - PPSN XIII, 2014, 8672 : 70 - 79