Calibrated and recalibrated expected improvements for Bayesian optimization

Cited: 0
Authors
Zhendong Guo
Yew-Soon Ong
Haitao Liu
Affiliations
[1] Xi’an Jiaotong University,School of Energy and Power Engineering
[2] Nanyang Technological University,Data Science and AI Research Center
[3] Dalian University of Technology,School of Energy and Power Engineering
Keywords
Expected improvement; Bayesian optimization; Gaussian process;
DOI: not available
Abstract
Expected improvement (EI), a function of the prediction uncertainty $\sigma(\mathbf{x})$ and the improvement quantity $(\xi - \hat{y}(\mathbf{x}))$, has been widely used to guide Bayesian optimization (BO). However, EI-based BO can get stuck in sub-optimal solutions even with a large number of samples. Previous studies attribute this sub-optimal convergence to the "over-exploitation" of EI. In contrast, we argue that, in addition to "over-exploitation", EI can also get trapped querying samples with maximum $\sigma(\mathbf{x})$ but poor objective function value $y(\mathbf{x})$. We call this issue "over-exploration", which can be a more challenging problem that leads to the sub-optimal convergence rate of BO.
To address "over-exploration" and "over-exploitation" simultaneously, we propose to calibrate the incumbent $\xi$ adaptively instead of fixing it at the present best solution in the EI formulation. Furthermore, we propose two calibrated versions of EI, namely the calibrated EI (CEI) and the recalibrated EI (REI), which combine the calibrated incumbent $\xi^{\text{Calibrated}}$ with a distance constraint to enhance the local exploitation and the global exploration of promising areas, respectively. We then integrate EI with CEI and REI to devise a novel BO algorithm named CR-EI. Through tests on seven benchmark functions and an engineering problem of airfoil optimization, the effectiveness of CR-EI is demonstrated.
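For reference, the standard EI acquisition that the abstract builds on can be sketched as below. This is a minimal illustration of the classical closed form $\mathrm{EI}(\mathbf{x}) = (\xi - \hat{y}(\mathbf{x}))\,\Phi(z) + \sigma(\mathbf{x})\,\varphi(z)$ with $z = (\xi - \hat{y}(\mathbf{x}))/\sigma(\mathbf{x})$ for minimization; the paper's calibrated incumbent $\xi^{\text{Calibrated}}$ and the CEI/REI variants are its contribution and are not reproduced here.

```python
import math

def expected_improvement(mu: float, sigma: float, xi: float) -> float:
    """Standard EI for minimization: E[max(xi - Y, 0)] with Y ~ N(mu, sigma^2).

    mu    -- GP posterior mean prediction y_hat(x)
    sigma -- GP posterior standard deviation sigma(x)
    xi    -- incumbent (classically the best objective value observed so far)
    """
    if sigma <= 0.0:
        # No predictive uncertainty: improvement is deterministic.
        return max(xi - mu, 0.0)
    z = (xi - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))       # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (xi - mu) * Phi + sigma * phi
```

With $\hat{y}(\mathbf{x}) = \xi$ the first term vanishes and EI reduces to $\sigma(\mathbf{x})\,\varphi(0)$, which shows how large $\sigma(\mathbf{x})$ alone can dominate the acquisition even where the predicted objective is poor, the "over-exploration" behavior the abstract describes.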
Pages: 3549–3567
Page count: 18
Related papers
50 entries in total
  • [21] Constrained Bayesian Optimization under Partial Observations: Balanced Improvements and Provable Convergence
    Wang, Shengbo
    Li, Ke
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 14, 2024, : 15607 - 15615
  • [22] Multi-Objective Bayesian Global Optimization using expected hypervolume improvement gradient
    Yang, Kaifeng
    Emmerich, Michael
    Deutz, Andre
    Back, Thomas
    SWARM AND EVOLUTIONARY COMPUTATION, 2019, 44 : 945 - 956
  • [23] AVEI-BO: an efficient Bayesian optimization using adaptively varied expected improvement
    Yan, Cheng
    Du, Han
    Kang, Enzi
    Mi, Dong
    Liu, He
    You, Yancheng
    STRUCTURAL AND MULTIDISCIPLINARY OPTIMIZATION, 2022, 65 (06)
  • [24] BOKEI: Bayesian optimization using knowledge of correlated torsions and expected improvement for conformer generation
    Chan, Lucian
    Hutchison, Geoffrey R.
    Morris, Garrett M.
    PHYSICAL CHEMISTRY CHEMICAL PHYSICS, 2020, 22 (09) : 5211 - 5219
  • [26] Assessment of Bayesian expected power via Bayesian bootstrap
    Liu, Fang
    STATISTICS IN MEDICINE, 2018, 37 (24) : 3471 - 3485
  • [27] A Bayesian basket trial design using a calibrated Bayesian hierarchical model
    Chu, Yiyi
    Yuan, Ying
    CLINICAL TRIALS, 2018, 15 (02) : 149 - 158
  • [28] A BAYESIAN FRAMEWORK FOR THE ESTIMATION OF OEF BY CALIBRATED MRI
    Germuska, M.
    Merola, A.
    Stone, A.
    Murphy, K.
    Wise, R.
    JOURNAL OF CEREBRAL BLOOD FLOW AND METABOLISM, 2016, 36 : 398 - 399
  • [29] Calibrated Bayesian credible intervals for binomial proportions
    Lyles, Robert H.
    Weiss, Paul
    Waller, Lance A.
    JOURNAL OF STATISTICAL COMPUTATION AND SIMULATION, 2020, 90 (01) : 75 - 89
  • [30] Multiplicity-calibrated Bayesian hypothesis tests
    Guo, Mengye
    Heitjan, Daniel F.
    BIOSTATISTICS, 2010, 11 (03) : 473 - 483