Calibrated and recalibrated expected improvements for Bayesian optimization

Cited by: 0
Authors
Zhendong Guo
Yew-Soon Ong
Haitao Liu
Affiliations
[1] Xi’an Jiaotong University,School of Energy and Power Engineering
[2] Nanyang Technological University,Data Science and AI Research Center
[3] Dalian University of Technology,School of Energy and Power Engineering
Keywords
Expected improvement; Bayesian optimization; Gaussian process
DOI
Not available
Abstract
Expected improvement (EI), a function of the prediction uncertainty $\sigma(\mathbf{x})$ and the improvement quantity $(\xi - \hat{y}(\mathbf{x}))$, has been widely used to guide Bayesian optimization (BO). However, EI-based BO can get stuck in sub-optimal solutions even with a large number of samples. Previous studies attribute this sub-optimal convergence to the "over-exploitation" of EI. In contrast, we argue that, in addition to "over-exploitation", EI can also get trapped in querying samples with maximum $\sigma(\mathbf{x})$ but poor objective function value $y(\mathbf{x})$. We refer to this issue as "over-exploration", which can be an even more challenging problem leading to the sub-optimal convergence rate of BO. To address "over-exploration" and "over-exploitation" simultaneously, we propose to calibrate the incumbent $\xi$ adaptively instead of fixing it at the present best solution in the EI formulation. Furthermore, we propose two calibrated versions of EI, namely calibrated EI (CEI) and recalibrated EI (REI), which combine the calibrated incumbent $\xi^{\text{Calibrated}}$ with a distance constraint to enhance the local exploitation and the global exploration of promising areas, respectively. We then integrate EI with CEI and REI to devise a novel BO algorithm named CR-EI. Through tests on seven benchmark functions and an engineering problem of airfoil optimization, the effectiveness of CR-EI is well demonstrated.
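
For reference, the closed-form EI that the abstract builds on can be written compactly. The following is a minimal sketch, assuming the minimization convention and a Gaussian process posterior with mean $\hat{y}(\mathbf{x})$ and standard deviation $\sigma(\mathbf{x})$; the paper's adaptive rule for $\xi^{\text{Calibrated}}$ and the distance constraint used by CEI/REI are defined in the paper itself and are not reproduced here:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, xi):
    """Closed-form EI for minimization: E[max(xi - Y, 0)] with Y ~ N(mu, sigma^2).

    mu    : GP posterior mean, y_hat(x)
    sigma : GP posterior standard deviation, sigma(x)
    xi    : incumbent; classically the best observed objective value.
            CEI/REI reuse this same closed form, but with a calibrated
            incumbent (the calibration rule comes from the paper, not shown).
    """
    sigma = np.maximum(sigma, 1e-12)  # guard against zero uncertainty at sampled points
    z = (xi - mu) / sigma
    return (xi - mu) * norm.cdf(z) + sigma * norm.pdf(z)
```

With $\xi$ fixed at the best observed value, the improvement term $(\xi - \hat{y}(\mathbf{x}))$ can become small everywhere, so the $\sigma(\mathbf{x})$ term dominates and EI keeps querying high-uncertainty, poor-value points, which is the "over-exploration" failure mode described above; calibrating $\xi$ adaptively rebalances this trade-off.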
Pages: 3549-3567 (18 pages)
Related papers (50 in total)
  • [1] Calibrated and recalibrated expected improvements for Bayesian optimization
    Guo, Zhendong
    Ong, Yew-Soon
    Liu, Haitao
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2021, 64 (06) : 3549 - 3567
  • [2] Unexpected Improvements to Expected Improvement for Bayesian Optimization
    Ament, Sebastian
    Daulton, Samuel
    Eriksson, David
    Balandat, Maximilian
    Bakshy, Eytan
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
  • [3] A Bayesian expression for the recalibrated value
    Tuninsky, V
    Wöger, W
    1998 CONFERENCE ON PRECISION ELECTROMAGNETIC MEASUREMENTS DIGEST, 1998, : 52 - 53
  • [4] Robust expected improvement for Bayesian optimization
    Christianson, Ryan B.
    Gramacy, Robert B.
    IISE TRANSACTIONS, 2024, 56 (12) : 1294 - 1306
  • [5] Online Calibrated and Conformal Prediction Improves Bayesian Optimization
    Deshpande, Shachi
    Marx, Charles
    Kuleshov, Volodymyr
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND STATISTICS, VOL 238, 2024, 238
  • [6] Exploration Enhanced Expected Improvement for Bayesian Optimization
    Berk, Julian
    Nguyen, Vu
    Gupta, Sunil
    Rana, Santu
    Venkatesh, Svetha
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2018, PT II, 2019, 11052 : 621 - 637
  • [7] A Hierarchical Expected Improvement Method for Bayesian Optimization
    Chen, Zhehui
    Mak, Simon
    Wu, C. F. Jeff
    JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION, 2024, 119 (546) : 1619 - 1632
  • [8] Self-Adjusting Weighted Expected Improvement for Bayesian Optimization
    Benjamins, Carolin
    Raponi, Elena
    Jankovic, Anja
    Doerr, Carola
    Lindauer, Marius
    INTERNATIONAL CONFERENCE ON AUTOMATED MACHINE LEARNING, VOL 224, 2023, 224
  • [9] Expected coordinate improvement for high-dimensional Bayesian optimization
    School of Computing and Artificial Intelligence, Southwest Jiaotong University, Chengdu, China
    SWARM AND EVOLUTIONARY COMPUTATION
  • [10] Towards Self-Adjusting Weighted Expected Improvement for Bayesian Optimization
    Benjamins, Carolin
    Raponi, Elena
    Jankovic, Anja
    Doerr, Carola
    Lindauer, Marius
    PROCEEDINGS OF THE 2023 GENETIC AND EVOLUTIONARY COMPUTATION CONFERENCE COMPANION, GECCO 2023 COMPANION, 2023, : 483 - 486