Multi-fidelity optimization method with Asynchronous Generalized Island Model for AutoML

Cited by: 0
Authors
Jurado, Israel Campero [1 ]
Vanschoren, Joaquin [1 ]
Affiliations
[1] Univ Eindhoven, Eindhoven, Netherlands
Keywords
Genetic programming; Machine learning; Parallelization; Parameter tuning and algorithm configuration
DOI
10.1145/3520304.3528917
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
AutoML frameworks seek to find the best configurations of machine learning techniques. A research field that has gained a great deal of popularity in recent years is the combined algorithm selection and hyperparameter optimization (CASH) problem. Several bio-inspired optimization techniques have been applied in AutoML, each with its own benefits and drawbacks. For instance, methods may get stuck evaluating computationally expensive models, or certain solutions may dominate early on and inhibit the discovery of better ones. We propose to run multiple bio-inspired techniques in parallel within a generalized island model and to combine this with multi-fidelity optimization to speed up the search. We analyze three island topologies (fully connected, unconnected, and ring) to understand the trade-offs between information sharing and maintaining diversity. With respect to convergence time, the proposed method outperforms Asynchronous Evolutionary Algorithms and Asynchronous Successive Halving. In an objective comparison based on the OpenML AutoML Benchmark, we also find that the proposed method is competitive with current state-of-the-art AutoML frameworks such as TPOT, AutoWEKA, AutoSklearn, H2O AutoML, GAMA, Asynchronous Successive Halving, and random search.
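The island model with ring migration and increasing fidelity described in the abstract can be sketched as follows. This is a toy, single-variable, synchronous illustration, not the authors' AutoML implementation: the noisy objective, the fidelity schedule, and all function names are hypothetical stand-ins (here, fidelity controls how many noisy samples are averaged, standing in for, e.g., training on a larger data fraction).

```python
import random

def evaluate(x, fidelity):
    # Hypothetical multi-fidelity objective: noisy estimate of -(x - 2)^2.
    # Higher fidelity -> more samples averaged -> less noise.
    samples = [-(x - 2.0) ** 2 + random.gauss(0.0, 1.0 / fidelity)
               for _ in range(fidelity)]
    return sum(samples) / fidelity

def island_step(pop, fidelity, rng):
    # One generation on one island: mutate each individual and keep
    # the better of parent and child ((1+1)-style selection).
    out = []
    for x in pop:
        child = x + rng.gauss(0.0, 0.5)
        out.append(child if evaluate(child, fidelity) > evaluate(x, fidelity) else x)
    return out

def ring_migrate(islands, fidelity):
    # Ring topology: island i sends its best individual to island i+1,
    # replacing the receiver's worst individual.
    bests = [max(pop, key=lambda x: evaluate(x, fidelity)) for pop in islands]
    for i, pop in enumerate(islands):
        worst = min(range(len(pop)), key=lambda j: evaluate(pop[j], fidelity))
        pop[worst] = bests[(i - 1) % len(islands)]

def run(n_islands=3, pop_size=4, generations=40, seed=0):
    random.seed(seed)        # noise in evaluate()
    rng = random.Random(42)  # initialization and mutation
    islands = [[rng.uniform(-5.0, 5.0) for _ in range(pop_size)]
               for _ in range(n_islands)]
    for gen in range(generations):
        fidelity = 5 + gen   # spend more evaluation budget as the search matures
        for i in range(n_islands):
            islands[i] = island_step(islands[i], fidelity, rng)
        if gen % 10 == 9:    # periodic migration along the ring
            ring_migrate(islands, fidelity)
    # Final selection at high fidelity; the toy optimum is x = 2.
    return max((x for pop in islands for x in pop),
               key=lambda x: evaluate(x, 200))
```

Each island could run a different bio-inspired algorithm (here all islands use the same mutation-plus-selection loop for brevity), and in the asynchronous variant the islands evolve and migrate without waiting for one another.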
Pages: 220-223 (4 pages)