Automatically Avoiding Overfitting in Deep Neural Networks by Using Hyper-Parameters Optimization Methods

Times Cited: 1
Authors
Kadhim, Zahraa Saddi [1 ]
Abdullah, Hasanen S. [1 ]
Ghathwan, Khalil I. [1 ]
Affiliations
[1] University of Technology, Department of Computer Science, Baghdad, Iraq
Keywords
deep learning; hyper-parameters optimization; regularization; overfitting; search
DOI
10.3991/ijoe.v19i05.38153
Chinese Library Classification
TP39 [Computer Applications]
Discipline Codes
081203; 0835
Abstract
Overfitting is a problem that deep learning in particular faces: it yields classification results that appear highly accurate but are in fact misleading. Consequently, if overfitting is not fully resolved, systems that depend on prediction or recognition and are sensitive to accuracy will produce untrustworthy results. Previous proposals have reduced this problem but have fallen short of eliminating it entirely while preserving essential information. This paper proposes a novel approach that eliminates overfitting completely while guaranteeing the preservation of critical data. Numeric and image datasets are used with two types of networks: convolutional neural networks and deep neural networks. Three regularization techniques (L1, L2, and dropout) are combined with two optimization algorithms (Bayesian optimization and random search) that select the hyper-parameters automatically; the choice of regularization technique is itself one of the automatically selected hyper-parameters. In addition to completely eliminating overfitting, the results showed accuracies on the image dataset of 97.82% and 90.72% with Bayesian and random search, respectively, and of 95.3% and 96.5% with the same algorithms on the numeric dataset.
Pages: 146-162 (17 pages)
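
The abstract's central idea is to make the regularization method itself a searchable hyper-parameter and let Bayesian optimization or random search pick it alongside the other settings. The following is a minimal sketch of that idea, not the authors' code: the network architecture, input shape, search ranges, and trial count are illustrative assumptions, since the record does not specify them. It assumes TensorFlow/Keras with the keras-tuner package.

```python
# Sketch: the regularization technique (L1, L2, or dropout) is one
# hyper-parameter among the others, chosen automatically by the tuner.
# NOT the paper's implementation; shapes and ranges are placeholders.
import keras_tuner as kt
from tensorflow import keras


def build_model(hp):
    # The regularization method is itself a tuned hyper-parameter.
    reg_choice = hp.Choice("regularization", ["l1", "l2", "dropout"])
    units = hp.Int("units", min_value=32, max_value=256, step=32)

    model = keras.Sequential()
    model.add(keras.layers.Input(shape=(784,)))  # placeholder input size

    if reg_choice in ("l1", "l2"):
        factor = hp.Float("reg_factor", 1e-5, 1e-2, sampling="log")
        reg = (keras.regularizers.l1(factor) if reg_choice == "l1"
               else keras.regularizers.l2(factor))
        model.add(keras.layers.Dense(units, activation="relu",
                                     kernel_regularizer=reg))
    else:
        model.add(keras.layers.Dense(units, activation="relu"))
        model.add(keras.layers.Dropout(hp.Float("dropout_rate", 0.1, 0.5)))

    model.add(keras.layers.Dense(10, activation="softmax"))
    model.compile(
        optimizer=keras.optimizers.Adam(
            hp.Float("learning_rate", 1e-4, 1e-2, sampling="log")),
        loss="sparse_categorical_crossentropy",
        metrics=["accuracy"],
    )
    return model


# Either search strategy from the abstract drives the same search space:
tuner = kt.BayesianOptimization(build_model, objective="val_accuracy",
                                max_trials=20, overwrite=True)
# tuner = kt.RandomSearch(build_model, objective="val_accuracy", max_trials=20)
# tuner.search(x_train, y_train, epochs=10, validation_split=0.2)
```

Because every trial is scored on validation accuracy, a weight penalty and dropout compete on equal footing, which is what makes the choice of regularizer just another tuned setting.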