Enhanced harmony search for hyperparameter tuning of deep neural networks

Cited: 0
Authors
Purnomo H.D. [1 ]
Gonsalves T. [2 ]
Wahyono T. [1 ]
Saian P.O.N. [1 ]
Affiliations
[1] Department of Information Technology, Universitas Kristen Satya Wacana, Salatiga
[2] Department of Information and Communication Sciences, Sophia University, Tokyo
Keywords
Configuration; Deep neural network; Harmony memory consideration rate; Harmony search; Rank-based selection
DOI
10.1007/s00500-024-09840-7
Abstract
The performance of a deep neural network (DNN) is affected by its configuration as well as by its training process. Determining the configuration of a DNN and training its parameters are challenging tasks because of the high dimensionality of the problem. There is therefore a need for methods that can optimize both the configuration and the parameters of a DNN. Most existing DNN optimization research concerns the optimization of DNN parameters; only a few studies discuss the optimization of DNN configuration. In this paper, an enhanced harmony search is proposed to optimize the configuration of a fully connected neural network (FCNN). The enhancement introduces several variants of the harmony memory consideration rate and of harmony memory selection. Four types of harmony memory consideration rate are proposed: constant rate, linear increase rate, linear decrease rate, and sigmoid rate. Two types of harmony memory selection are proposed: rank-based selection and random selection. Combining the consideration-rate and selection variants yields eight harmony search scenarios. The performance of the proposed method is compared with random search and a genetic algorithm on 12 classification datasets. The experimental results show that the proposed harmony search outperforms random search on 8 of the 12 problems and performs comparably on the remaining 4. It also outperforms the genetic algorithm on 5 problems, performs comparably on 6, and performs worse on 1. In addition, combining the harmony memory consideration rate variants with rank-based selection improves on the ordinary harmony search; the combination of the linear increase rate and rank-based selection performs best of all, beating the ordinary harmony search on 7 problems, performing approximately equally on 3, and worse on 2. The results show that the proposed method has several advantages for solving classification problems with a DNN. First, the DNN configuration is represented as an optimization problem, so the method can find an FCNN configuration suited to a specific problem. Second, the approach is a global optimization approach, as it tunes the DNN hyperparameters (configuration) as well as the DNN parameters (connection weights); it is therefore able to find the best combination of DNN configuration and connection weights. However, a strategy is still needed to balance hyperparameter tuning against parameter tuning, since an inappropriate balance can lead to high computational cost. Future research can be directed at balancing hyperparameter and parameter tuning during the solution search. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024.
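The abstract names four harmony memory consideration rate (HMCR) schedules and a rank-based memory selection but does not give their formulas. The sketch below illustrates one plausible reading in Python; the hmcr_min/hmcr_max bounds, the sigmoid slope, and the linear rank weighting are illustrative assumptions, not the authors' exact parameterization.

import math
import random

def hmcr(iteration, max_iter, kind="linear_increase",
         hmcr_min=0.7, hmcr_max=0.99):
    """Illustrative HMCR schedules matching the four variants named in
    the abstract; the bounds and the sigmoid slope are assumed values."""
    t = iteration / max_iter  # search progress in [0, 1]
    if kind == "constant":
        return hmcr_max
    if kind == "linear_increase":
        return hmcr_min + (hmcr_max - hmcr_min) * t
    if kind == "linear_decrease":
        return hmcr_max - (hmcr_max - hmcr_min) * t
    if kind == "sigmoid":
        # S-shaped rise centered at mid-search; the slope 10 is assumed.
        return hmcr_min + (hmcr_max - hmcr_min) / (1 + math.exp(-10 * (t - 0.5)))
    raise ValueError(f"unknown schedule: {kind}")

def rank_based_select(memory, fitness):
    """Pick a harmony with probability proportional to its rank, so the
    best-ranked harmony is most likely to be chosen. Linear rank weights
    are an assumption; the paper may weight ranks differently."""
    order = sorted(range(len(memory)), key=lambda i: fitness[i])  # ascending loss
    n = len(memory)
    weights = [n - r for r in range(n)]  # best harmony -> weight n, worst -> 1
    return memory[random.choices(order, weights=weights, k=1)[0]]

During improvisation, each decision variable would be drawn from harmony memory with probability hmcr(t) (via rank_based_select or uniform random selection, depending on the scenario) and generated at random otherwise.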
Pages: 9905-9919
Page count: 14
Related papers (50 records in total)
  • [1] Optimal hyperparameter tuning of convolutional neural networks based on the parameter-setting-free harmony search algorithm
    Lee, Woo-Young
    Park, Seung-Min
    Sim, Kwee-Bo
    OPTIK, 2018, 172: 359-367
  • [2] DeepHyper: Asynchronous Hyperparameter Search for Deep Neural Networks
    Balaprakash, Prasanna
    Salim, Michael
    Uram, Thomas D.
    Vishwanath, Venkat
    Wild, Stefan M.
    2018 IEEE 25TH INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPUTING (HIPC), 2018: 42-51
  • [3] Hyperparameter Tuning for Deep Neural Networks Based Optimization Algorithm
    Vidyabharathi, D.
    Mohanraj, V.
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 36 (03): 2559-2573
  • [4] Fine-tuning Deep Belief Networks using Harmony Search
    Papa, Joao Paulo
    Scheirer, Walter
    Cox, David Daniel
    APPLIED SOFT COMPUTING, 2016, 46: 875-885
  • [5] Fine-Tuning Convolutional Neural Networks Using Harmony Search
    Rosa, Gustavo
    Papa, Joao
    Marana, Aparecido
    Scheirer, Walter
    Cox, David
    PROGRESS IN PATTERN RECOGNITION, IMAGE ANALYSIS, COMPUTER VISION, AND APPLICATIONS, CIARP 2015, 2015, 9423: 683-690
  • [7] Automatic Hyperparameter Tuning in Deep Convolutional Neural Networks Using Asynchronous Reinforcement Learning
    Neary, Patrick L.
    2018 IEEE INTERNATIONAL CONFERENCE ON COGNITIVE COMPUTING (ICCC), 2018: 73-77
  • [8] HyperNOMAD: Hyperparameter Optimization of Deep Neural Networks Using Mesh Adaptive Direct Search
    Lakhmiri, Dounia
    Le Digabel, Sebastien
    Tribes, Christophe
    ACM TRANSACTIONS ON MATHEMATICAL SOFTWARE, 2021, 47 (03)
  • [9] Basic Enhancement Strategies When Using Bayesian Optimization for Hyperparameter Tuning of Deep Neural Networks
    Cho, Hyunghun
    Kim, Yongjin
    Lee, Eunjung
    Choi, Daeyoung
    Lee, Yongjae
    Rhee, Wonjong
    IEEE ACCESS, 2020, 8: 52588-52608
  • [10] An Empirical Study of the Impact of Hyperparameter Tuning and Model Optimization on the Performance Properties of Deep Neural Networks
    Liao, Lizhi
    Li, Heng
    Shang, Weiyi
    Ma, Lei
    ACM TRANSACTIONS ON SOFTWARE ENGINEERING AND METHODOLOGY, 2022, 31 (03)