A Population-Based Hybrid Approach for Hyperparameter Optimization of Neural Networks

Cited by: 7
Authors
Japa, Luis [1 ]
Serqueira, Marcello [2 ]
Mendonca, Israel [1 ]
Aritsugi, Masayoshi [3 ]
Bezerra, Eduardo [2 ]
Gonzalez, Pedro Henrique [4 ]
Affiliations
[1] Kumamoto Univ, Grad Sch Sci & Technol, Kumamoto 8608555, Japan
[2] Fed Ctr Technol Educ Rio De Janeiro CEFET RJ, BR-20271110 Rio De Janeiro, Brazil
[3] Kumamoto Univ, Fac Adv Sci & Technol, Kumamoto 8608555, Japan
[4] Univ Fed Rio de Janeiro, Syst Engn & Comp Sci Postgrad Program, BR-21941914 Rio De Janeiro, Brazil
Keywords
Genetic algorithms; hyperparameter optimization; machine learning; KEY GENETIC ALGORITHM;
DOI
10.1109/ACCESS.2023.3277310
Chinese Library Classification (CLC) number
TP [Automation Technology, Computer Technology]
Discipline classification code
0812
Abstract
Hyperparameter optimization is a fundamental part of Automated Machine Learning (AutoML) and has been widely researched in recent years; however, it remains one of the main challenges in the area. Motivated by the need for faster and more accurate hyperparameter optimization algorithms, we developed HyperBRKGA, a new population-based approach for hyperparameter optimization. HyperBRKGA combines the Biased Random Key Genetic Algorithm with an Exploitation Method in order to search the hyperparameter space more efficiently than commonly used hyperparameter optimization algorithms, such as Grid Search, Random Search, CMA-ES, or Bayesian Optimization. We developed and tested two alternatives for this Exploitation Method: Random Walk and Bayesian Walk. We also implemented complementary schemes, namely a Training Data Reduction Strategy and a Diversity Control strategy, to further improve the efficacy of the method. We performed several computational experiments on 8 different datasets to assess the effectiveness of the proposed approach. The results showed that HyperBRKGA found hyperparameter configurations that outperformed the baseline methods in terms of predictive quality on 6 of the 8 datasets, while maintaining a reasonable execution time. Lastly, an ablation study showed that each added component contributed to the quality of the results.
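A minimal, self-contained Python/NumPy sketch of the kind of loop the abstract describes: random-key encoding of hyperparameters, biased crossover between elite and non-elite individuals, random mutants for diversity, and a simple random-walk exploitation step. This is not the authors' HyperBRKGA implementation; the decoder ranges, the synthetic fitness function, and the parameter values (population size, elite count, inheritance probability RHO) are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def decode(keys):
    # Map random keys in [0, 1) to concrete hyperparameter values
    # (ranges chosen for illustration only).
    return {
        "learning_rate": 10 ** (-4 + 3 * keys[0]),  # 1e-4 .. 1e-1, log scale
        "hidden_units": int(16 + keys[1] * 240),    # 16 .. 256
        "dropout": 0.5 * keys[2],                   # 0.0 .. 0.5
    }

def fitness(keys):
    # Stand-in objective: in practice this would train a network and return a
    # validation score; a smooth synthetic function keeps the sketch runnable.
    hp = decode(keys)
    return (-(np.log10(hp["learning_rate"]) + 2.5) ** 2
            - ((hp["hidden_units"] - 128) / 128) ** 2
            - hp["dropout"])

POP, ELITE, MUTANTS, KEYS, RHO, GENS = 20, 4, 3, 3, 0.7, 30
pop = rng.random((POP, KEYS))

for gen in range(GENS):
    pop = pop[np.argsort([-fitness(ind) for ind in pop])]  # best first
    elite, non_elite = pop[:ELITE], pop[ELITE:]

    # Biased crossover: each key is inherited from the elite parent with
    # probability RHO, otherwise from the non-elite parent.
    children = []
    for _ in range(POP - ELITE - MUTANTS):
        e = elite[rng.integers(ELITE)]
        n = non_elite[rng.integers(POP - ELITE)]
        mask = rng.random(KEYS) < RHO
        children.append(np.where(mask, e, n))

    mutants = rng.random((MUTANTS, KEYS))  # random immigrants for diversity

    # Exploitation step (assumed random-walk form): locally perturb the best
    # individual and keep the move only if the fitness improves.
    candidate = np.clip(pop[0] + rng.normal(0.0, 0.05, KEYS), 0.0, 1.0 - 1e-9)
    if fitness(candidate) > fitness(pop[0]):
        pop[0] = candidate

    pop = np.vstack([pop[:ELITE], children, mutants])

pop = pop[np.argsort([-fitness(ind) for ind in pop])]
print("best hyperparameters:", decode(pop[0]))

In the paper's setup the fitness evaluation would be the expensive model-training step, which is why the abstract emphasizes schemes such as training-data reduction and diversity control around this basic loop.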
Pages: 50752 - 50768
Number of pages: 17
Related papers
50 in total
  • [11] Hybrid approach to complexity optimization of neural networks
    Lee, H
    Jee, T
    Park, H
    Lee, Y
    8TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING, VOLS 1-3, PROCEEDING, 2001, : 1455 - 1460
  • [12] Hybrid Approach for TSP Based on Neural Networks and Ant Colony Optimization
    Mueller, Carsten
    Kiehne, Niklas
    2015 IEEE SYMPOSIUM SERIES ON COMPUTATIONAL INTELLIGENCE (IEEE SSCI), 2015, : 1431 - 1435
  • [13] Hyperparameter Optimization Techniques for Designing Software Sensors Based on Artificial Neural Networks
    Blume, Sebastian
    Benedens, Tim
    Schramm, Dieter
    SENSORS, 2021, 21 (24)
  • [14] A Population-Based Hybrid Extremal Optimization Algorithm
    Chen, Yu
    Zhang, Kai
    Zou, Xiufen
    BIO-INSPIRED COMPUTING AND APPLICATIONS, 2012, 6840 : 410 - 417
  • [15] Population-Based Hyperparameter Tuning With Multitask Collaboration
    Li, Wendi
    Wang, Ting
    Ng, Wing W. Y.
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2023, 34 (09) : 5719 - 5731
  • [16] Efficient Hyperparameter Optimization for Convolution Neural Networks in Deep Learning: A Distributed Particle Swarm Optimization Approach
    Guo, Yu
    Li, Jian-Yu
    Zhan, Zhi-Hui
    CYBERNETICS AND SYSTEMS, 2020, 52 (01) : 36 - 57
  • [17] Hyperparameter Optimization for Convolutional Neural Networks with Genetic Algorithms and Bayesian Optimization
    Puentes G, David E.
    Barrios H, Carlos J.
    Navaux, Philippe O. A.
    2022 IEEE LATIN AMERICAN CONFERENCE ON COMPUTATIONAL INTELLIGENCE (LA-CCI), 2022, : 131 - 135
  • [18] Base Hybrid Approach for TSP Based on Neural Networks and Ant Colony Optimization
    Mueller, Carsten
    Kiehne, Niklas
    INTELLIGENT AND EVOLUTIONARY SYSTEMS, IES 2015, 2016, 5 : 219 - 226
  • [19] Speeding up the Hyperparameter Optimization of Deep Convolutional Neural Networks
    Hinz, Tobias
    Navarro-Guerrero, Nicolas
    Magg, Sven
    Wermter, Stefan
    INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2018, 17 (02)
  • [20] Multiagent Reinforcement Learning for Hyperparameter Optimization of Convolutional Neural Networks
    Iranfar, Arman
    Zapater, Marina
    Atienza, David
    IEEE TRANSACTIONS ON COMPUTER-AIDED DESIGN OF INTEGRATED CIRCUITS AND SYSTEMS, 2022, 41 (04) : 1034 - 1047