Towards Safe Multi-Task Bayesian Optimization

Cited by: 0
Authors
Luebsen, Jannis O. [1 ]
Hespe, Christian [1 ]
Eichler, Annika [1 ,2 ]
Affiliations
[1] Hamburg Univ Technol, Hamburg, Germany
[2] Deutsch Elektronen Synchrotron DESY, Hamburg, Germany
Keywords
Bayesian Optimization; Gaussian Processes; Controller Tuning; Safe Optimization; BOUNDS;
DOI
Not available
Chinese Library Classification (CLC)
TP [Automation technology; computer technology]
Discipline Classification Code
0812
Abstract
Bayesian optimization has emerged as a highly effective tool for the safe online optimization of systems due to its high sample efficiency and noise robustness. To further increase its efficiency, reduced physical models of the system can be incorporated into the optimization process, accelerating it. These models provide an approximation of the actual system and are significantly cheaper to evaluate. The similarity between model and reality is represented by additional hyperparameters, which are learned within the optimization process. Safety is a crucial criterion for online optimization methods such as Bayesian optimization; recent works provide safety guarantees under the assumption of known hyperparameters, which, however, does not hold in practice. Therefore, we extend the robust Gaussian process uniform error bounds to the multi-task setting, which involves calculating a confidence region from the hyperparameter posterior distribution using Markov chain Monte Carlo methods. The robust safety bounds are then employed to safely optimize the system while incorporating measurements of the models. Simulation results indicate that, depending on the fidelity of the models, the optimization of expensive-to-evaluate functions can be significantly accelerated compared with other state-of-the-art safe Bayesian optimization methods. The code is accessible on GitHub(1).
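The abstract describes safety bounds that hold uniformly over a confidence region of Gaussian process hyperparameters obtained via Markov chain Monte Carlo. The following minimal Python sketch illustrates that idea under simplifying assumptions (single task, the objective itself acting as the safety constraint, a plain RBF kernel, random-walk Metropolis with a flat prior); it is not the authors' implementation, and all names such as f, rbf, metropolis, and SAFE_THRESHOLD are illustrative. The robust bounds are taken as the worst case of the usual GP confidence bounds over the sampled hyperparameters, and the next query is chosen optimistically from the candidates whose robust lower bound satisfies the safety threshold.

    # Minimal sketch (not the authors' code): one safe Bayesian optimization step
    # with GP hyperparameters sampled by MCMC and confidence bounds taken robustly
    # (worst case) over those samples. Single task; the objective is also the
    # safety constraint; the multi-task / reduced-model part of the paper is omitted.
    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):                      # unknown objective, must stay >= SAFE_THRESHOLD
        return np.sin(3.0 * x) + 0.5 * x

    SAFE_THRESHOLD = -0.5
    NOISE_STD = 0.05

    def rbf(xa, xb, lengthscale, signal_var):
        d = xa[:, None] - xb[None, :]
        return signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)

    def log_marginal_likelihood(theta, x, y):
        lengthscale, signal_var = np.exp(theta)          # work in log space
        K = rbf(x, x, lengthscale, signal_var) + NOISE_STD ** 2 * np.eye(len(x))
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        return (-0.5 * y @ alpha
                - np.sum(np.log(np.diag(L)))
                - 0.5 * len(x) * np.log(2.0 * np.pi))

    def metropolis(x, y, n_samples=200, step=0.15):
        """Random-walk Metropolis over log-hyperparameters (flat prior)."""
        theta = np.log(np.array([0.5, 1.0]))             # [lengthscale, signal_var]
        logp = log_marginal_likelihood(theta, x, y)
        samples = []
        for _ in range(n_samples):
            prop = theta + step * rng.standard_normal(2)
            logp_prop = log_marginal_likelihood(prop, x, y)
            if np.log(rng.random()) < logp_prop - logp:
                theta, logp = prop, logp_prop
            samples.append(theta.copy())
        return np.exp(np.array(samples[n_samples // 2:]))  # discard burn-in

    def gp_posterior(xq, x, y, lengthscale, signal_var):
        K = rbf(x, x, lengthscale, signal_var) + NOISE_STD ** 2 * np.eye(len(x))
        Ks = rbf(x, xq, lengthscale, signal_var)
        Kss = rbf(xq, xq, lengthscale, signal_var)
        L = np.linalg.cholesky(K)
        alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
        v = np.linalg.solve(L, Ks)
        mean = Ks.T @ alpha
        std = np.sqrt(np.maximum(np.diag(Kss) - np.sum(v * v, axis=0), 1e-12))
        return mean, std

    # --- one safe-BO iteration ---------------------------------------------
    x_data = np.array([0.0, 0.4, 0.8])                   # initial safe evaluations
    y_data = f(x_data) + NOISE_STD * rng.standard_normal(len(x_data))
    x_cand = np.linspace(-1.0, 2.0, 200)                 # candidate inputs
    beta = 2.0                                           # confidence scaling

    hyper_samples = metropolis(x_data, y_data)

    lower = np.full_like(x_cand, np.inf)                 # robust lower bound
    upper = np.full_like(x_cand, -np.inf)                # robust upper bound
    for lengthscale, signal_var in hyper_samples:
        mean, std = gp_posterior(x_cand, x_data, y_data, lengthscale, signal_var)
        lower = np.minimum(lower, mean - beta * std)     # worst case over samples
        upper = np.maximum(upper, mean + beta * std)

    safe = lower >= SAFE_THRESHOLD                       # robustly certified candidates
    if not np.any(safe):                                 # no certified point: stay put
        next_x = x_data[np.argmax(y_data)]
    else:
        next_x = x_cand[safe][np.argmax(upper[safe])]    # optimistic pick in safe set
    print(f"next safe query point: {next_x:.3f}")

The multi-task aspect of the paper, where cheap model evaluations enter the GP through additional similarity hyperparameters, is omitted here; the sketch only shows how MCMC hyperparameter samples translate into robust confidence bounds for a safe query selection.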
Pages: 839 - 851
Number of pages: 13
Related Papers
50 records in total
  • [21] Bayesian Multi-Task Relationship Learning with Link Structure
    Li, Yingming
    Yang, Ming
    Qi, Zhongang
    Zhang, Zhongfei
    IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2016, 28 (04) : 873 - 887
  • [22] Multi-task Transfer Learning for Bayesian Network Structures
    Benikhlef, Sarah
    Leray, Philippe
    Raschia, Guillaume
    Ben Messaoud, Montassar
    Sakly, Fayrouz
    SYMBOLIC AND QUANTITATIVE APPROACHES TO REASONING WITH UNCERTAINTY, ECSQARU 2021, 2021, 12897 : 217 - 228
  • [23] Multi-Task Learning as Multi-Objective Optimization
    Sener, Ozan
    Koltun, Vladlen
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [24] Towards multi-task transfer optimization of cloud service collaboration in industrial internet platform
    Zhou, Jiajun
    Gao, Liang
    Lu, Chao
    Yao, Xifan
    ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, 2023, 80
  • [25] Multi-task Crowdsourcing via an Optimization Framework
    Zhou, Yao
    Ying, Lei
    He, Jingrui
    ACM TRANSACTIONS ON KNOWLEDGE DISCOVERY FROM DATA, 2019, 13 (03)
  • [26] Automating Knowledge Transfer with Multi-Task Optimization
    Scott, Eric O.
    De Jong, Kenneth A.
    2019 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2019, : 2252 - 2259
  • [27] Multi-Task Bayesian Compressive Sensing Exploiting Intra-Task Dependency
    Wu, Qisong
    Zhang, Yimin D.
    Amin, Moeness G.
    Himed, Braham
    IEEE SIGNAL PROCESSING LETTERS, 2015, 22 (04) : 430 - 434
  • [28] Multi-task gradient descent for multi-task learning
    Bai, Lu
    Ong, Yew-Soon
    He, Tiantian
    Gupta, Abhishek
    MEMETIC COMPUTING, 2020, 12 (04) : 355 - 369
  • [30] Towards Mixture of Task-Intensive Experts for Multi-task Recommendation
    Cai, Xun
    Lu, Yuxiang
    Lu, Hongtao
    Ding, Yue
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2024, PT 3, 2025, 14852 : 323 - 332