A Population-Based Hybrid Approach for Hyperparameter Optimization of Neural Networks

Cited by: 7
Authors
Japa, Luis [1 ]
Serqueira, Marcello [2 ]
Mendonca, Israel [1 ]
Aritsugi, Masayoshi [3 ]
Bezerra, Eduardo [2 ]
Gonzalez, Pedro Henrique [4 ]
Affiliations
[1] Kumamoto Univ, Grad Sch Sci & Technol, Kumamoto 8608555, Japan
[2] Fed Ctr Technol Educ Rio De Janeiro CEFET RJ, BR-20271110 Rio De Janeiro, Brazil
[3] Kumamoto Univ, Fac Adv Sci & Technol, Kumamoto 8608555, Japan
[4] Univ Fed Rio de Janeiro, Syst Engn & Comp Sci Postgrad Program, BR-21941914 Rio De Janeiro, Brazil
Keywords
Genetic algorithms; hyperparameter optimization; machine learning; key genetic algorithm
DOI
10.1109/ACCESS.2023.3277310
CLC classification number
TP [Automation technology; computer technology]
Discipline classification code
0812
Abstract
Hyperparameter optimization is a fundamental part of Automated Machine Learning (AutoML) and has been widely researched in recent years; however, it remains one of the main challenges in this area. Motivated by the need for faster and more accurate hyperparameter optimization algorithms, we developed HyperBRKGA, a new population-based approach for hyperparameter optimization. HyperBRKGA combines the Biased Random-Key Genetic Algorithm with an Exploitation Method in order to search the hyperparameter space more efficiently than other commonly used hyperparameter optimization algorithms, such as Grid Search, Random Search, CMA-ES, or Bayesian Optimization. We develop and test two alternatives for this Exploitation Method: Random Walk and Bayesian Walk. We also discuss and implement other schemes, such as a Training Data Reduction Strategy and a Diversity Control Strategy, to further improve the efficacy of our method. We performed several computational experiments on 8 different datasets to assess the effectiveness of the proposed approach. Results showed that HyperBRKGA found hyperparameter configurations that outperformed the baseline methods in terms of predictive quality on 6 out of the 8 datasets, while showing reasonable execution times. Lastly, we conducted an ablation study and showed that every component was relevant to achieving high-quality results.
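To illustrate the kind of search the abstract describes, the sketch below shows a generic biased random-key genetic algorithm (BRKGA) applied to a hyperparameter space. This is not the authors' HyperBRKGA implementation (it omits the Exploitation Method, Training Data Reduction, and Diversity Control components); the search space, decoder, surrogate fitness function, and all parameter values are illustrative assumptions only.

```python
# Sketch of a biased random-key GA over a hypothetical hyperparameter space.
# Every chromosome is a vector of keys in [0, 1); a decoder maps keys to a
# concrete configuration, so the GA operators stay problem-independent.
import random

# Hypothetical discrete search space (assumption, not from the paper).
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2, 1e-1],
    "hidden_units": [32, 64, 128, 256],
    "dropout": [0.0, 0.25, 0.5],
}

def decode(chromosome):
    """Map a vector of random keys to a concrete hyperparameter setting."""
    config = {}
    for key, (name, values) in zip(chromosome, SPACE.items()):
        # A key k in [0, 1) selects index floor(k * len(values)).
        config[name] = values[min(int(key * len(values)), len(values) - 1)]
    return config

def fitness(config):
    """Toy surrogate for validation accuracy; a real run trains a network."""
    return -abs(config["learning_rate"] - 1e-2) - abs(config["dropout"] - 0.25)

def brkga(pop_size=20, elite_frac=0.2, mutant_frac=0.1, rho=0.7,
          generations=30, seed=0):
    rng = random.Random(seed)
    n = len(SPACE)
    pop = [[rng.random() for _ in range(n)] for _ in range(pop_size)]
    n_elite = max(1, int(elite_frac * pop_size))
    n_mutants = max(1, int(mutant_frac * pop_size))
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(decode(c)), reverse=True)
        elite, non_elite = pop[:n_elite], pop[n_elite:]
        offspring = []
        while len(offspring) < pop_size - n_elite - n_mutants:
            e, o = rng.choice(elite), rng.choice(non_elite)
            # Biased crossover: each gene is inherited from the elite
            # parent with probability rho.
            offspring.append([e[i] if rng.random() < rho else o[i]
                              for i in range(n)])
        # Fresh random mutants keep the population exploring.
        mutants = [[rng.random() for _ in range(n)] for _ in range(n_mutants)]
        pop = elite + offspring + mutants
    best = max(pop, key=lambda c: fitness(decode(c)))
    return decode(best)

best_config = brkga()
```

The elite set is carried over unchanged each generation, so the best fitness found never decreases; in HyperBRKGA, per the abstract, a Random Walk or Bayesian Walk exploitation step would additionally refine promising configurations.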
Pages: 50752 - 50768
Page count: 17
Related Papers
50 results
  • [1] Generalized Population-Based Training for Hyperparameter Optimization in Reinforcement Learning
    Bai, Hui
    Cheng, Ran
    IEEE TRANSACTIONS ON EMERGING TOPICS IN COMPUTATIONAL INTELLIGENCE, 2024: 1 - 13
  • [2] Provably Efficient Online Hyperparameter Optimization with Population-Based Bandits
    Parker-Holder, Jack
    Nguyen, Vu
    Roberts, Stephen J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [3] Hyperparameter optimization of neural networks based on Q-learning
    Qi, Xin
    Xu, Bing
    SIGNAL IMAGE AND VIDEO PROCESSING, 2023, 17 (04) : 1669 - 1676
  • [5] Hyperparameter Tuning for Deep Neural Networks Based Optimization Algorithm
    Vidyabharathi, D.
    Mohanraj, V.
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 36 (03): 2559 - 2573
  • [6] HyperTube: A Framework for Population-Based Online Hyperparameter Optimization with Resource Constraints
    Jie, Renlong
    Gao, Junbin
    Vasnev, Andrey
    Tran, Minh-Ngoc
    IEEE ACCESS, 2020, 8 (08): 69038 - 69057
  • [7] Parallel hyperparameter optimization of spiking neural networks
    Firmin, Thomas
    Boulet, Pierre
    Talbi, El-Ghazali
    NEUROCOMPUTING, 2024, 609
  • [8] An effective algorithm for hyperparameter optimization of neural networks
    Diaz, G. I.
    Fokoue-Nkoutche, A.
    Nannicini, G.
    Samulowitz, H.
    IBM JOURNAL OF RESEARCH AND DEVELOPMENT, 2017, 61 (4-5)
  • [9] Online Hyperparameter Optimization for Streaming Neural Networks
    Gunasekara, Nuwan
    Gomes, Heitor Murilo
    Pfahringer, Bernhard
    Bifet, Albert
    2022 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2022,
  • [10] Scour modeling using deep neural networks based on hyperparameter optimization
    Asim, Mohammed
    Rashid, Adnan
    Ahmad, Tanvir
    ICT EXPRESS, 2022, 8 (03): 357 - 362