Combining Bayesian optimization and Lipschitz optimization

Citations: 0
Authors
Mohamed Osama Ahmed
Sharan Vaswani
Mark Schmidt
Affiliations
[1] University of British Columbia
Source
Machine Learning | 2020, Vol. 109
Keywords
Bayesian optimization; Global optimization; Lipschitz optimization; Optimization
DOI
Not available
Abstract
The Bayesian optimization and Lipschitz optimization communities have developed alternative techniques for optimizing black-box functions, each exploiting a different form of prior knowledge about the function. In this work, we explore strategies to combine these techniques for better global optimization. In particular, we propose ways to use the Lipschitz continuity assumption within traditional BO algorithms, which we call Lipschitz Bayesian optimization (LBO). This approach does not increase the asymptotic runtime and in some cases drastically improves performance (while in the worst case performance is similar). Indeed, in a particular setting, we prove that using the Lipschitz information yields the same or a better bound on the regret compared to using Bayesian optimization on its own. Moreover, we propose a simple heuristic to estimate the Lipschitz constant, and prove that a growing estimate of the Lipschitz constant is in some sense "harmless". Our experiments on 15 datasets with 4 acquisition functions show that in the worst case LBO performs similarly to the underlying BO method, while in some cases it performs substantially better. Thompson sampling in particular typically saw drastic improvements (as the Lipschitz information corrected for its well-known "over-exploration" phenomenon), and its LBO variant often outperformed other acquisition functions.
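The abstract names two concrete ingredients: a Lipschitz constant estimated from the data (the "growing estimate"), and Lipschitz bounds used alongside a BO acquisition function. The sketch below illustrates one plausible reading of that combination on a toy 1-D minimization problem: estimate L from the largest slope observed between evaluated points, then discard candidates whose Lipschitz lower bound already exceeds the incumbent minimum. The helper names (estimate_lipschitz, lipschitz_lower_bound) and the distance-based stand-in for a real GP acquisition (EI, UCB, or Thompson sampling) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the LBO idea under the assumptions stated above:
# Lipschitz bounds prune candidates that provably cannot improve on the
# incumbent before a (stand-in) acquisition function picks the next point.
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    """Toy 1-D black-box objective to minimize."""
    return np.sin(3.0 * x) + 0.5 * x

def estimate_lipschitz(X, y):
    """Growing Lipschitz estimate: the largest observed slope
    |y_i - y_j| / |x_i - x_j| over all pairs of evaluated points."""
    d = np.abs(X[:, None] - X[None, :])
    g = np.abs(y[:, None] - y[None, :])
    mask = d > 0
    return (g[mask] / d[mask]).max()

def lipschitz_lower_bound(cand, X, y, L):
    """For an L-Lipschitz f, f(x) >= max_i (y_i - L * |x - x_i|)."""
    return np.max(y[None, :] - L * np.abs(cand[:, None] - X[None, :]), axis=1)

X = rng.uniform(0.0, 4.0, size=3)   # initial design on the domain [0, 4]
y = f(X)

for _ in range(10):
    L = estimate_lipschitz(X, y)
    cand = rng.uniform(0.0, 4.0, size=256)
    lb = lipschitz_lower_bound(cand, X, y, L)

    # LBO-style pruning: keep only candidates whose Lipschitz lower bound
    # could still beat the incumbent minimum.
    feasible = cand[lb < y.min()]
    if feasible.size == 0:           # bounds ruled out every sampled point
        feasible = cand

    # Stand-in acquisition: prefer points far from past evaluations.
    # A real implementation would use EI, UCB, or Thompson sampling here.
    score = np.min(np.abs(feasible[:, None] - X[None, :]), axis=1)
    x_next = feasible[np.argmax(score)]

    X = np.append(X, x_next)
    y = np.append(y, f(x_next))

print(f"estimated L = {estimate_lipschitz(X, y):.3f}, best f = {y.min():.4f}")
```

Note how the pruning step degrades gracefully: an overestimate of L loosens the lower bounds, rejects fewer candidates, and the loop falls back toward the plain acquisition, which is one plausible sense in which a growing estimate of the Lipschitz constant is "harmless".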
Pages: 79–102
Number of pages: 23