MUMBO: MUlti-task Max-Value Bayesian Optimization

Cited by: 0
Authors
Moss, Henry B. [1 ]
Leslie, David S. [2 ]
Rayson, Paul [3 ]
Affiliations
[1] Univ Lancaster, STOR I Ctr Doctoral Training, Lancaster, Lancs, England
[2] Univ Lancaster, Dept Math & Stat, Lancaster, Lancs, England
[3] Univ Lancaster, Sch Comp & Commun, Lancaster, Lancs, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
Bayesian optimization; Gaussian processes; FIDELITY; DESIGN; MODELS; OUTPUT;
DOI
10.1007/978-3-030-67664-3_27
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose MUMBO, the first high-performing yet computationally efficient acquisition function for multi-task Bayesian optimization. Here, the challenge is to perform efficient optimization by evaluating low-cost functions somehow related to our true target function. This is a broad class of problems including the popular task of multi-fidelity optimization. However, while information-theoretic acquisition functions are known to provide state-of-the-art Bayesian optimization, existing implementations for multi-task scenarios have prohibitive computational requirements. Previous acquisition functions have therefore been suitable only for problems with both low-dimensional parameter spaces and function query costs sufficiently large to overshadow very significant optimization overheads. In this work, we derive a novel multi-task version of entropy search, delivering robust performance with low computational overheads across classic optimization challenges and multi-task hyper-parameter tuning. MUMBO is scalable and efficient, allowing multi-task Bayesian optimization to be deployed in problems with rich parameter and fidelity spaces.
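The abstract builds on max-value entropy search, which scores a candidate point by the expected information its evaluation gives about the value of the global maximum. As a minimal illustrative sketch (not the authors' implementation, and omitting MUMBO's multi-task extension), the single-task max-value entropy search quantity can be computed in closed form from the GP posterior mean and standard deviation together with Monte Carlo samples of the maximum; the function name and argument conventions below are assumptions for illustration:

```python
import numpy as np
from scipy.stats import norm

def max_value_entropy_search(mu, sigma, max_samples):
    """Illustrative single-task max-value entropy search acquisition.

    mu, sigma   : GP posterior mean / std at candidate points, shape (n,)
    max_samples : sampled values of the global maximum y*, shape (m,)
    Returns an estimate of the information gained about y* by
    evaluating each candidate point (higher is better).
    """
    mu = np.asarray(mu, dtype=float)
    # Guard against numerically zero posterior variance.
    sigma = np.maximum(np.asarray(sigma, dtype=float), 1e-10)
    # Standardized gap between each sampled maximum and each candidate.
    gamma = (np.asarray(max_samples, dtype=float)[:, None] - mu[None, :]) / sigma[None, :]
    cdf = np.clip(norm.cdf(gamma), 1e-10, 1.0)
    # Closed-form entropy reduction per y* sample, averaged over samples.
    per_sample = gamma * norm.pdf(gamma) / (2.0 * cdf) - np.log(cdf)
    return per_sample.mean(axis=0)
```

Under this formulation, a candidate with larger posterior uncertainty (all else equal) scores higher, since its evaluation reveals more about the maximum; MUMBO's contribution is making this style of acquisition tractable when the samples come from cheap related tasks rather than the target function itself.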
Pages: 447-462
Page count: 16