Convergence Analysis of PSO for Hyper-Parameter Selection in Deep Neural Networks

Cited by: 5
Authors
Nalepa, Jakub [1 ,2 ]
Lorenzo, Pablo Ribalta [1 ]
Affiliations
[1] Future Processing, Gliwice, Poland
[2] Silesian University of Technology, Gliwice, Poland
Keywords
Convergence analysis; PSO; Hyper-parameter selection; DNNs
DOI
10.1007/978-3-319-69835-9_27
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Deep Neural Networks (DNNs) have gained enormous research attention since they consistently outperform other state-of-the-art methods in a plethora of machine learning tasks. However, their performance strongly depends on the DNN hyper-parameters, which are commonly tuned by experienced practitioners. Recently, we introduced Particle Swarm Optimization (PSO) and parallel PSO techniques to automate this process. In this work, we theoretically and experimentally investigate the convergence capabilities of these algorithms. The experiments were performed for several DNN architectures (both gradually augmented and hand-crafted by a human) using two challenging multi-class benchmark datasets: MNIST and CIFAR-10.
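The record above does not include the paper's algorithmic details, so the sketch below is only a minimal, generic illustration of how PSO can search a DNN hyper-parameter space. The two-dimensional search space (learning rate and dropout rate), the toy objective function, and all parameter names are assumptions made here for illustration; in the paper's setting each objective evaluation would involve training and validating a DNN (e.g., on MNIST or CIFAR-10), and the parallel PSO variant is not shown.

```python
# Minimal, illustrative PSO sketch for hyper-parameter search (not the paper's code).
import random

# Hypothetical search space: (learning_rate, dropout_rate).
BOUNDS = [(1e-4, 1e-1), (0.0, 0.8)]

def objective(params):
    """Stand-in for 'train a DNN with these hyper-parameters and return
    its validation error'; here a smooth toy function is used instead."""
    lr, dropout = params
    return (lr - 0.01) ** 2 + (dropout - 0.5) ** 2

def pso(n_particles=10, n_iters=30, w=0.7, c1=1.5, c2=1.5, seed=0):
    rng = random.Random(seed)
    dim = len(BOUNDS)
    # Initialise positions uniformly in the search space, velocities at zero.
    pos = [[rng.uniform(lo, hi) for lo, hi in BOUNDS] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Canonical velocity update: inertia + cognitive + social terms.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Clamp the new position to the search-space bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], BOUNDS[d][0]), BOUNDS[d][1])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

if __name__ == "__main__":
    best, best_val = pso()
    print("Best hyper-parameters:", best, "objective:", best_val)
```

In practice the objective evaluation (training a DNN per particle) dominates the runtime, which is what motivates evaluating particles in parallel as in the parallel PSO variant mentioned in the abstract.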
Pages: 284-295
Page count: 12
Related Papers
50 records in total
  • [41] Hyper-Parameter Optimization for Convolutional Neural Network Committees Based on Evolutionary Algorithms
    Bochinski, Erik
    Senst, Tobias
    Sikora, Thomas
    2017 24th IEEE International Conference on Image Processing (ICIP), 2017: 3924-3928
  • [42] Hyper-parameter optimization of deep learning model for prediction of Parkinson's disease
    Kaur, Sukhpal
    Aggarwal, Himanshu
    Rani, Rinkle
    Machine Vision and Applications, 2020, 31 (05)
  • [43] Cultural Events Classification using Hyper-parameter Optimization of Deep Learning Technique
    Feng, Zhipeng
    Gani, Hamdan
    International Journal of Advanced Computer Science and Applications, 2021, 12 (05): 603-609
  • [44] A GPU Scheduling Framework to Accelerate Hyper-Parameter Optimization in Deep Learning Clusters
    Son, Jaewon
    Yoo, Yonghyuk
    Kim, Khu-rai
    Kim, Youngjae
    Lee, Kwonyong
    Park, Sungyong
    Electronics, 2021, 10 (03): 1-15
  • [46] Hyper-parameter optimization in neural-based translation systems: A case study
    Datta, Goutam
    Joshi, Nisheeth
    Gupta, Kusum
    International Journal on Smart Sensing and Intelligent Systems, 2023, 16 (01)
  • [47] Ensemble Adaptation Networks with low-cost unsupervised hyper-parameter search
    Zhang, Haotian
    Ding, Shifei
    Jia, Weikuan
    Pattern Analysis and Applications, 2020, 23 (03): 1215-1224
  • [49] Object Detection Using Deep Convolutional Generative Adversarial Networks Embedded Single Shot Detector with Hyper-parameter Optimization
    Dinakaran, Ranjith
    Zhang, Li
    2021 IEEE Symposium Series on Computational Intelligence (IEEE SSCI 2021), 2021
  • [50] The Impact of Hyper-Parameter Tuning for Landscape-Aware Performance Regression and Algorithm Selection
    Jankovic, Anja
    Popovski, Gorjan
    Eftimov, Tome
    Doerr, Carola
    Proceedings of the 2021 Genetic and Evolutionary Computation Conference (GECCO'21), 2021: 687-696