Automated machine learning hyperparameters tuning through meta-guided Bayesian optimization

Cited by: 0
Authors
Garouani, Moncef [1 ,2 ]
Bouneffa, Mourad [1 ]
Affiliations
[1] Univ Littoral Cote dOpale, Lab Informat Signal & Image Cote Opale LISIC, UR 4491, F-62228 Calais, France
[2] Univ Toulouse Capitole, IRIT, CNRS, UMR 5505, F-31000 Toulouse, France
Keywords
Hyperparameters optimization; Bayesian optimization; Meta-learning; Meta-Guided Bayesian Optimization; fANOVA; Gaussian processes; Selection
DOI
10.1007/s13748-023-00311-y
Chinese Library Classification (CLC)
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
The selection of one or more suitable Machine Learning (ML) algorithms and the configuration of their significant hyperparameters are among the most crucial yet challenging tasks in advanced data analytics with ML, and they are essential for applying ML-based solutions to real-world problems. Bayesian Optimization (BO) is a popular method for optimizing such black-box functions, but it scales poorly to large problems because it does not leverage knowledge from previous applications: many function evaluations are wasted on poor hyperparameter configurations. To address this issue, we propose Meta-Guided Bayesian Optimization, which reuses knowledge from previous optimization cycles on similar tasks to decide which parts of the input space should be evaluated next. Concretely, we guide BO with a functional ANOVA (fANOVA) of the configurations suggested by a meta-learning process. Using a large collection of hyperparameter optimization benchmarks, we show that the proposed approach is about three times faster than vanilla BO and achieves new state-of-the-art performance on nine classification datasets.
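The abstract describes the mechanism only at a high level. The sketch below is a minimal, hypothetical Python illustration of the general idea of warm-starting and guiding BO with knowledge from prior runs on similar tasks; it is not the authors' implementation. All function names and parameters are illustrative assumptions, and a random-forest importance score stands in for the paper's fANOVA analysis.

```python
# Hypothetical sketch of meta-guided Bayesian optimization (not the authors' code).
# Assumptions: prior runs on similar tasks are available as (config, score) arrays,
# higher scores are better, and a random-forest importance score is used as a
# crude stand-in for fANOVA hyperparameter importance.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern
from scipy.stats import norm


def importance_proxy(prior_configs, prior_scores):
    """Rank hyperparameter dimensions by random-forest feature importance
    fitted on prior (configuration, score) data (fANOVA stand-in)."""
    rf = RandomForestRegressor(n_estimators=100, random_state=0)
    rf.fit(prior_configs, prior_scores)
    return rf.feature_importances_


def expected_improvement(mu, sigma, best):
    """Standard expected-improvement acquisition for maximization."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)


def meta_guided_bo(objective, bounds, prior_configs, prior_scores,
                   n_init=5, n_iter=25, top_k_dims=None, rng=None):
    """BO warm-started and guided by prior runs.

    bounds: (dim, 2) array of [low, high] ranges per hyperparameter.
    - Warm start: evaluate the best prior configurations first.
    - Guidance: candidates vary only the dimensions the importance proxy
      marks as influential; the rest stay at their best-known values.
    """
    rng = rng or np.random.default_rng(0)
    dim = bounds.shape[0]
    importances = importance_proxy(prior_configs, prior_scores)
    top = np.argsort(importances)[::-1][: (top_k_dims or dim)]

    # Warm start from the most promising prior configurations.
    order = np.argsort(prior_scores)[::-1][:n_init]
    X = list(prior_configs[order])
    y = [objective(x) for x in X]

    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    for _ in range(n_iter):
        gp.fit(np.array(X), np.array(y))
        best_x, best_y = X[int(np.argmax(y))], max(y)

        # Candidates: sample important dimensions freely, anchor the others.
        cand = np.tile(best_x, (512, 1))
        cand[:, top] = rng.uniform(bounds[top, 0], bounds[top, 1], (512, len(top)))
        mu, sigma = gp.predict(cand, return_std=True)
        x_next = cand[int(np.argmax(expected_improvement(mu, sigma, best_y)))]

        X.append(x_next)
        y.append(objective(x_next))
    return X[int(np.argmax(y))], max(y)
```

A caller would supply an `objective` that trains the model with a given configuration and returns a validation score to maximize, plus prior (configuration, score) pairs collected on similar tasks; both the warm start and the restricted candidate sampling are the "meta-guidance" in this simplified reading.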
Pages: 12
Related papers (50 in total)
  • [11] Improving Hardenability Modeling: A Bayesian Optimization Approach to Tuning Hyperparameters for Neural Network Regression
    Gemechu, Wendimu Fanta
    Sitek, Wojciech
    Batalha, Gilmar Ferreira
    APPLIED SCIENCES-BASEL, 2024, 14 (06):
  • [12] Optimization on selecting XGBoost hyperparameters using meta-learning
    Lima Marinho, Tiago
    do Nascimento, Diego Carvalho
    Pimentel, Bruno Almeida
    EXPERT SYSTEMS, 2024, 41 (09)
  • [13] Efficient hyperparameters optimization through model-based reinforcement learning with experience exploiting and meta-learning
    Liu, Xiyuan
    Wu, Jia
    Chen, Senpeng
    SOFT COMPUTING, 2023, 27 (13) : 8661 - 8678
  • [15] Effective Diagnosis Approach for Broken Rotor Bar Fault Using Bayesian-Based Optimization of Machine Learning Hyperparameters
    Bechiri, Mohammed Bachir
    Allal, Abderrahim
    Naoui, Mohamed
    Khechekhouche, Abderrahmane
    Alsaif, Haitham
    Boudjemline, Attia
    Alshammari, Badr M.
    Alqunun, Khalid
    Guesmi, Tawfik
    IEEE ACCESS, 2024, 12 : 139923 - 139936
  • [16] Improving Genomic Prediction with Machine Learning Incorporating TPE for Hyperparameters Optimization
    Liang, Mang
    An, Bingxing
    Li, Keanning
    Du, Lili
    Deng, Tianyu
    Cao, Sheng
    Du, Yueying
    Xu, Lingyang
    Gao, Xue
    Zhang, Lupei
    Li, Junya
    Gao, Huijiang
    BIOLOGY-BASEL, 2022, 11 (11):
  • [17] Spiderweb Nanomechanical Resonators via Bayesian Optimization: Inspired by Nature and Guided by Machine Learning
    Shin, Dongil
    Cupertino, Andrea
    de Jong, Matthijs H. J.
    Steeneken, Peter G.
    Bessa, Miguel A.
    Norte, Richard A.
    ADVANCED MATERIALS, 2022, 34 (03)
  • [18] Machine Learning Assisted Hyperparameter Tuning for Optimization
    Linkous, Lauren
    Lundquist, Jonathan
    Suche, Michael
    Topsakal, Erdem
    2024 IEEE INC-USNC-URSI RADIO SCIENCE MEETING (JOINT WITH AP-S SYMPOSIUM), 2024, : 107 - 108
  • [19] Automated open-stope design optimization through machine learning methods
    Varela, Nelson Morales
    Retamal, Aldo Quelopana
    INTERNATIONAL JOURNAL OF MINING RECLAMATION AND ENVIRONMENT, 2025, 39 (04) : 273 - 292
  • [20] Bayesian networks for interpretable machine learning and optimization
    Mihaljevic, Bojan
    Bielza, Concha
    Larranaga, Pedro
    NEUROCOMPUTING, 2021, 456 : 648 - 665