Robust expected improvement for Bayesian optimization

Cited by: 2
Authors
Christianson, Ryan B. [1 ]
Gramacy, Robert B. [2 ]
Affiliations
[1] Univ Chicago, NORC, Dept Stat & Data Sci, Chicago, IL 60637 USA
[2] Virginia Tech, Dept Stat, Blacksburg, VA USA
Keywords
Robust optimization; Gaussian process; active learning; sequential design; EFFICIENT GLOBAL OPTIMIZATION; SIMULATION; ALGORITHM; SEARCH;
DOI
10.1080/24725854.2023.2275166
CLC Classification
T [Industrial Technology];
Discipline Code
08 ;
Abstract
Bayesian Optimization (BO) links Gaussian Process (GP) surrogates with sequential design toward optimizing expensive-to-evaluate black-box functions. Example design heuristics, or so-called acquisition functions, like expected improvement, balance exploration and exploitation to furnish global solutions under stringent evaluation budgets. However, they fall short when solving for robust optima, meaning a preference for solutions in a wider domain of attraction. Robust solutions are useful when inputs are imprecisely specified, or where a series of solutions is desired. A common mathematical programming technique in such settings involves an adversarial objective, biasing a local solver away from "sharp" troughs. Here we propose a surrogate modeling and active learning technique called robust expected improvement that ports adversarial methodology into the BO/GP framework. After describing the methods, we illustrate and draw comparisons to several competitors on benchmark synthetic exercises and real problems of varying complexity.
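As a sketch of the acquisition function the abstract builds on, expected improvement (for minimization) at a candidate point with Gaussian Process posterior mean mu and standard deviation sigma, relative to the best value observed so far, is EI = (f_best - mu) * Phi(z) + sigma * phi(z) with z = (f_best - mu) / sigma. The function and toy values below are illustrative, not code from the paper:

```python
import math

def expected_improvement(mu: float, sigma: float, f_best: float) -> float:
    """Expected improvement for minimization at one candidate point.

    mu, sigma : GP posterior mean and standard deviation at the candidate.
    f_best    : best (lowest) objective value observed so far.
    """
    if sigma <= 0.0:
        # Degenerate posterior: improvement is deterministic.
        return max(f_best - mu, 0.0)
    z = (f_best - mu) / sigma
    # Standard-normal pdf and cdf via math.erf keeps this dependency-free.
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (f_best - mu) * cdf + sigma * pdf

# Toy posterior over three candidates, best observation f_best = 0.0:
# a high-uncertainty point, a confidently better point, a slightly worse one.
candidates = [(0.0, 1.0), (-0.5, 0.1), (0.2, 0.5)]
ei = [expected_improvement(m, s, f_best=0.0) for m, s in candidates]
```

The first term rewards exploitation (low predicted mean), the second rewards exploration (high posterior uncertainty); BO evaluates the objective at the candidate maximizing EI. The robust variant proposed in the paper modifies this trade-off with an adversarial inner objective to avoid "sharp" troughs.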
Pages: 1294-1306
Page count: 13
Related Papers
50 records
  • [1] Unexpected Improvements to Expected Improvement for Bayesian Optimization
    Ament, Sebastian
    Daulton, Samuel
    Eriksson, David
    Balandat, Maximilian
    Bakshy, Eytan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [2] Exploration Enhanced Expected Improvement for Bayesian Optimization
    Berk, Julian
    Vu Nguyen
    Gupta, Sunil
    Rana, Santu
    Venkatesh, Svetha
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 621 - 637
  • [3] A Hierarchical Expected Improvement Method for Bayesian Optimization
    Chen, Zhehui
    Mak, Simon
    Wu, C. F. Jeff
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (546) : 1619 - 1632
  • [4] Self-Adjusting Weighted Expected Improvement for Bayesian Optimization
    Benjamins, Carolin
    Raponi, Elena
    Jankovic, Anja
    Doerr, Carola
    Lindauer, Marius
    INTERNATIONAL CONFERENCE ON AUTOMATED MACHINE LEARNING, VOL 224, 2023, 224
  • [5] Expected coordinate improvement for high-dimensional Bayesian optimization
    School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, China
    SWARM AND EVOLUTIONARY COMPUTATION
  • [6] Towards Self-Adjusting Weighted Expected Improvement for Bayesian Optimization
    Benjamins, Carolin
    Raponi, Elena
    Jankovic, Anja
    Doerr, Carola
    Lindauer, Marius
    PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 483 - 486
  • [7] Parallel Bayesian Optimization of Multiple Noisy Objectives with Expected Hypervolume Improvement
    Daulton, Samuel
    Balandat, Maximilian
    Bakshy, Eytan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [8] Efficient hybrid Bayesian optimization algorithm with adaptive expected improvement acquisition function
    Xu, Zhaoyi
    Guo, Yanjie
    Saleh, Joseph H.
    ENGINEERING OPTIMIZATION, 2021, 53 (10) : 1786 - 1804
  • [9] A multi-fidelity Bayesian optimization approach based on the expected further improvement
    Leshi Shu
    Ping Jiang
    Yan Wang
    Structural and Multidisciplinary Optimization, 2021, 63 : 1709 - 1719
  • [10] A multi-fidelity Bayesian optimization approach based on the expected further improvement
    Shu, Leshi
    Jiang, Ping
    Wang, Yan
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2021, 63 (04) : 1709 - 1719