Evolutionary Optimization of Hyperparameters in Deep Learning Models

Cited by: 0
Authors
Kim, Jin-Young [1 ]
Cho, Sung-Bae [1 ]
Affiliations
[1] Yonsei Univ, Dept Comp Sci, Seoul, South Korea
Keywords
Genetic programming; deep learning; neural networks; activation function; optimization technique
DOI
10.1109/cec.2019.8790354
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology]
Discipline Codes
0808; 0809
Abstract
Deep learning has recently become one of the most popular techniques in artificial intelligence. However, constructing a deep learning model requires setting up various components, including activation functions, optimization methods, and the configuration of the model structure, collectively called hyperparameters. Because these choices affect the performance of deep learning, researchers work hard to find optimal hyperparameters when solving problems with deep learning. The activation function and the optimization technique play crucial roles in the forward and backward passes of model learning, yet they are usually chosen heuristically. Previous studies have optimized either the activation function or the optimization technique, while the relationship between them has been neglected because the two are not searched simultaneously. In this paper, we propose a novel method based on genetic programming to find optimal activation functions and optimization techniques at the same time. In genetic programming, each individual is composed of two chromosomes: one for the activation function and the other for the optimization technique. To calculate the fitness of an individual, we construct a neural network with the activation function and optimization technique that the individual represents. The deep learning model found through our method achieves accuracies of 82.59% and 53.04% on the CIFAR-10 and CIFAR-100 datasets, respectively, outperforming the conventional methods. Moreover, we analyze the activation function found and confirm the usefulness of the proposed method.
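To make the two-chromosome encoding concrete, below is a minimal Python sketch, not the authors' implementation: one chromosome encodes the activation function as a small expression tree over an assumed primitive set, the other selects an assumed optimization technique (plain SGD or momentum), and fitness is the accuracy of a tiny two-layer network trained with exactly those two components on toy data. The primitives, optimizer set, network size, numeric derivative, and Gaussian toy data are all illustrative assumptions.

# Minimal sketch (illustrative assumptions, not the paper's actual code): a GP
# individual carries two chromosomes -- an activation-function expression tree
# and an optimizer choice -- and its fitness is the accuracy of a small
# network built from those two components.
import random
import numpy as np

np.seterr(over="ignore", invalid="ignore")  # evolved activations can overflow; treat as low fitness

# Chromosome 1: activation function as an expression tree over assumed primitives.
UNARY = {
    "tanh": np.tanh,
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-np.clip(z, -60, 60))),
    "relu": lambda z: np.maximum(z, 0.0),
}
BINARY = {"add": np.add, "mul": np.multiply, "max": np.maximum}

def random_tree(depth=2):
    """Grow a random activation-function tree; the terminal is the raw input."""
    if depth == 0 or random.random() < 0.3:
        return ("x",)
    if random.random() < 0.5:
        return (random.choice(list(UNARY)), random_tree(depth - 1))
    return (random.choice(list(BINARY)), random_tree(depth - 1), random_tree(depth - 1))

def apply_tree(tree, z):
    """Evaluate the activation tree elementwise on pre-activations z."""
    op = tree[0]
    if op == "x":
        return z
    if op in UNARY:
        return UNARY[op](apply_tree(tree[1], z))
    return BINARY[op](apply_tree(tree[1], z), apply_tree(tree[2], z))

# Chromosome 2: optimization technique (reduced here to a choice of update rule).
OPTIMIZERS = ("sgd", "momentum")

def random_individual():
    return {"activation": random_tree(), "optimizer": random.choice(OPTIMIZERS)}

def fitness(individual, x, y, hidden=16, epochs=200, lr=0.1):
    """Train a 2-layer network using the individual's two chromosomes and
    return its accuracy as the fitness value."""
    rng = np.random.default_rng(0)
    w1 = rng.normal(scale=0.5, size=(x.shape[1], hidden))
    w2 = rng.normal(scale=0.5, size=(hidden, 1))
    v1, v2 = np.zeros_like(w1), np.zeros_like(w2)
    act = lambda z: apply_tree(individual["activation"], z)
    eps = 1e-4
    for _ in range(epochs):
        z1 = x @ w1
        a1 = act(z1)
        p = 1.0 / (1.0 + np.exp(-np.clip(a1 @ w2, -60, 60)))    # sigmoid output
        dz2 = (p - y) / len(y)                                   # cross-entropy gradient
        dw2 = a1.T @ dz2
        da1 = dz2 @ w2.T
        dz1 = da1 * (act(z1 + eps) - act(z1 - eps)) / (2 * eps)  # numeric activation derivative
        dw1 = x.T @ dz1
        if individual["optimizer"] == "sgd":                     # chromosome 2 in action
            w1 -= lr * dw1
            w2 -= lr * dw2
        else:                                                    # momentum
            v1 = 0.9 * v1 - lr * dw1
            v2 = 0.9 * v2 - lr * dw2
            w1 += v1
            w2 += v2
    pred = (1.0 / (1.0 + np.exp(-np.clip(act(x @ w1) @ w2, -60, 60)))) > 0.5
    return float(np.mean(pred == (y > 0.5)))

# Toy usage: score a small random population on two Gaussian blobs and keep the
# fittest individual (selection, crossover, and mutation are omitted here).
rng = np.random.default_rng(1)
x = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.vstack([np.zeros((100, 1)), np.ones((100, 1))])
population = [random_individual() for _ in range(8)]
best = max(population, key=lambda ind: fitness(ind, x, y))
print("best activation tree:", best["activation"], "| optimizer:", best["optimizer"])

In the paper's setting, the fitness of each individual would instead come from training the constructed network on the target dataset (CIFAR-10/100), and genetic-programming operators applied per chromosome would drive the search rather than the one-shot random sampling shown above.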
Pages: 831-837
Page count: 7
Related Papers (50 records in total)
  • [41] Bayesian Optimization of Hyperparameters in Kernel-Based Delay Rational Models. Treviso, F.; Trinchero, R.; Canavero, F. G. IEEE Electromagnetic Compatibility Magazine, 2021, 10(02): 90-93
  • [42] Investigating the Effects of Hyperparameters in Quantum-Enhanced Deep Reinforcement Learning. Fikadu Tilaye, Getahun; Pandey, Amit. Quantum Engineering, 2023, 2023
  • [43] Photovoltaic Power Forecast Using Deep Learning Techniques with Hyperparameters Based on Bayesian Optimization: A Case Study in the Galapagos Islands. Guanoluisa, Richard; Arcos-Aviles, Diego; Flores-Calero, Marco; Martinez, Wilmar; Guinjoan, Francesc. Sustainability, 2023, 15(16)
  • [44] Radiogenomic Prediction of MGMT Using Deep Learning with Bayesian Optimized Hyperparameters. Farzana, Walia; Temtam, Ahmed G.; Shboul, Zeina A.; Rahman, M. Monibor; Sadique, M. Shibly; Iftekharuddin, Khan M. Brainlesion: Glioma, Multiple Sclerosis, Stroke and Traumatic Brain Injuries, BrainLes 2021, Pt II, 2022, 12963: 357-366
  • [45] Rice Leaf Diseases Recognition Based on Deep Learning and Hyperparameters Customization. Hoang, Van-Dung. Frontiers of Computer Vision, IW-FCV 2021, 2021, 1405: 189-200
  • [46] Bayesian Optimization of Hyperparameters in Kernel-Based Delay Rational Models. Treviso, Felipe; Trinchero, Riccardo; Canavero, Flavio G. SPI 2021: 25th IEEE Workshop on Signal and Power Integrity, 2021
  • [47] Optimization of Hyperparameters in Object Detection Models Based on Fractal Loss Function. Zhou, Ming; Li, Bo; Wang, Jue. Fractal and Fractional, 2022, 6(12)
  • [48] Improving Genomic Prediction with Machine Learning Incorporating TPE for Hyperparameters Optimization. Liang, Mang; An, Bingxing; Li, Keanning; Du, Lili; Deng, Tianyu; Cao, Sheng; Du, Yueying; Xu, Lingyang; Gao, Xue; Zhang, Lupei; Li, Junya; Gao, Huijiang. Biology-Basel, 2022, 11(11)
  • [49] Toward Auto-Learning Hyperparameters for Deep Learning-Based Recommender Systems. Sun, Bo; Wu, Di; Shang, Mingsheng; He, Yi. Database Systems for Advanced Applications, DASFAA 2022, Pt II, 2022: 323-331
  • [50] MODRL/D-EL: Multiobjective Deep Reinforcement Learning with Evolutionary Learning for Multiobjective Optimization. Zhang, Yongxin; Wang, Jiahai; Zhang, Zizhen; Zhou, Yalan. 2021 International Joint Conference on Neural Networks (IJCNN), 2021