Enhanced harmony search for hyperparameter tuning of deep neural networks

Cited by: 0
Authors
Purnomo H.D. [1]
Gonsalves T. [2]
Wahyono T. [1]
Saian P.O.N. [1]
Affiliations
[1] Department of Information Technology, Universitas Kristen Satya Wacana, Salatiga
[2] Department of Information and Communication Sciences, Sophia University, Tokyo
Keywords
Configuration; Deep neural network; Harmony memory consideration rate; Harmony search; Rank-based selection
DOI
10.1007/s00500-024-09840-7
Abstract
The performance of a deep neural network (DNN) is affected by its configuration as well as by its training process. Determining the configuration of a DNN and training its parameters are challenging tasks because of the high dimensionality of the resulting search problems. There is therefore a need for methods that can optimize both the configuration and the parameters of a DNN. Most existing DNN optimization research concerns the optimization of DNN parameters; only a few studies address the optimization of DNN configuration. In this paper, an enhanced harmony search is proposed to optimize the configuration of a fully connected neural network (FCNN). The enhancement introduces several types of harmony memory consideration rate and of harmony memory selection. Four types of harmony memory consideration rate are proposed: constant rate, linearly increasing rate, linearly decreasing rate, and sigmoid rate. Two types of harmony memory selection are proposed: rank-based selection and random selection. Combining the consideration-rate types with the selection types yields eight harmony search scenarios. The performance of the proposed method is compared with random search and a genetic algorithm on 12 classification datasets. The experimental results show that the proposed harmony search outperforms random search on 8 of the 12 problems and performs approximately the same on the remaining 4. Harmony search also outperforms the genetic algorithm on five problems, performs approximately the same on six, and performs worse on one. In addition, combining the various types of harmony memory consideration rate with rank-based selection improves on the ordinary harmony search. The combination of the linearly increasing consideration rate and rank-based selection performs best among all combinations: it is better than the ordinary harmony search on seven problems, approximately equal on three, and worse on two. The results show that the proposed method has two advantages for solving classification problems with a DNN. First, the DNN configuration is represented as an optimization problem, so the method can find a specific FCNN configuration suited to a specific problem. Second, the approach is a global optimization approach, as it tunes the DNN hyperparameters (configuration) as well as the DNN parameters (connection weights); it can therefore find the best combination of configuration and connection weights. However, a strategy is still needed to balance hyperparameter tuning against parameter tuning, because an inappropriate balance can lead to high computational cost. Future research can be directed toward balancing hyperparameter and parameter tuning during the solution search. © The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature 2024.
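The abstract names four harmony memory consideration rate (HMCR) schedules and two harmony memory selection strategies but does not give their exact formulas. The sketch below is a minimal illustration of how such schedules and a rank-based pick from the harmony memory could look; the HMCR bounds, the logistic curve used for the sigmoid rate, the linear rank weights, and the helper names (hmcr_schedule, rank_based_select, improvise_value) are illustrative assumptions, not the authors' implementation.

```python
import math
import random

# Illustrative HMCR schedules: constant, linearly increasing, linearly
# decreasing, and sigmoid-shaped rates over the iteration budget.
# hmcr_min, hmcr_max, and steepness are assumed values for the sketch.
def hmcr_schedule(kind, t, t_max, hmcr_min=0.7, hmcr_max=0.99, steepness=10.0):
    """Return the harmony memory consideration rate at iteration t of t_max."""
    progress = t / t_max
    if kind == "constant":
        return hmcr_max
    if kind == "linear_increase":
        return hmcr_min + (hmcr_max - hmcr_min) * progress
    if kind == "linear_decrease":
        return hmcr_max - (hmcr_max - hmcr_min) * progress
    if kind == "sigmoid":
        return hmcr_min + (hmcr_max - hmcr_min) / (1.0 + math.exp(-steepness * (progress - 0.5)))
    raise ValueError(f"unknown schedule: {kind}")

# Rank-based selection from the harmony memory: better harmonies (here,
# lower loss) receive larger selection probability, in contrast to the
# ordinary uniform-random pick. Linear rank weighting is an assumption.
def rank_based_select(harmony_memory, fitness):
    order = sorted(range(len(harmony_memory)), key=lambda i: fitness[i])  # best first
    weights = [len(order) - rank for rank in range(len(order))]           # best gets largest weight
    chosen = random.choices(order, weights=weights, k=1)[0]
    return harmony_memory[chosen]

# Ordinary random selection from the harmony memory.
def random_select(harmony_memory, fitness):
    return random.choice(harmony_memory)

# One improvisation step for a single decision variable (e.g., the number
# of units in a hidden layer), combining a schedule with a selection strategy.
def improvise_value(harmony_memory, fitness, var_index, value_range, t, t_max,
                    schedule="linear_increase", select=rank_based_select):
    if random.random() < hmcr_schedule(schedule, t, t_max):
        return select(harmony_memory, fitness)[var_index]  # memory consideration
    return random.choice(value_range)                      # random consideration
```

A full implementation would embed improvise_value in the usual harmony search loop (improvise a complete harmony, evaluate the FCNN it encodes, replace the worst member of the harmony memory) and would also apply pitch adjustment, which is omitted from this sketch.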
Pages: 9905-9919
Number of pages: 14