Optimizing Deep Feedforward Neural Network Architecture: A Tabu Search Based Approach

Cited by: 40
Authors
Gupta, Tarun Kumar [1 ]
Raza, Khalid [1 ]
Affiliation
[1] Jamia Millia Islamia, Dept Comp Sci, New Delhi 110025, India
Keywords
Tabu search (TS); Deep feedforward neural network (DFNN); Hidden layer; Hidden neurons; Optimization; Architecture; OPTIMIZATION METHODOLOGY; GENETIC ALGORITHM; PARAMETERS; WEIGHTS; DESIGN;
DOI
10.1007/s11063-020-10234-7
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
The optimal architecture of a deep feedforward neural network (DFNN) is essential for better accuracy and faster convergence, and training a DFNN becomes increasingly tedious as the depth of the network grows. A DFNN can be tuned through several parameters, such as the number of hidden layers, the number of neurons in each hidden layer, and the number of connections between layers. The architecture is usually chosen by trial and error, which is an exponential combinatorial problem and a tedious task. Addressing this problem requires an algorithm that can automatically design an optimal architecture with improved generalization ability. This work proposes a new methodology that simultaneously optimizes the number of hidden layers and the number of neurons in each layer of a DFNN. It combines the advantages of Tabu search with gradient descent with momentum as the backpropagation training algorithm. The proposed approach has been tested on four classification benchmark datasets, and the optimized networks show better generalization ability.
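The sketch below illustrates the general idea described in the abstract, not the authors' exact algorithm: a Tabu search over hidden-layer configurations in which each candidate architecture is trained with gradient descent with momentum (here via scikit-learn's MLPClassifier with the SGD solver) and scored on a validation split, while a fixed-length tabu list blocks recently visited architectures. The dataset, move operators, tabu tenure, starting architecture, and training hyperparameters are all assumptions made for illustration.

# A minimal sketch, assuming scikit-learn; not the authors' implementation.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def evaluate(arch, X_tr, y_tr, X_val, y_val):
    """Train a DFNN with the given hidden-layer sizes; return validation accuracy."""
    net = MLPClassifier(hidden_layer_sizes=tuple(arch),
                        solver="sgd", momentum=0.9,        # gradient descent with momentum
                        learning_rate_init=0.01, max_iter=300, random_state=0)
    net.fit(X_tr, y_tr)
    return net.score(X_val, y_val)

def neighbours(arch, step=8, max_layers=5):
    """Neighbourhood moves: grow/shrink one layer, append a layer, drop the last layer."""
    moves = []
    for i, n in enumerate(arch):
        moves.append(arch[:i] + [n + step] + arch[i + 1:])
        if n - step > 0:
            moves.append(arch[:i] + [n - step] + arch[i + 1:])
    if len(arch) < max_layers:
        moves.append(arch + [arch[-1]])
    if len(arch) > 1:
        moves.append(arch[:-1])
    return moves

def tabu_search(X_tr, y_tr, X_val, y_val, iterations=15, tenure=5):
    current = [16]                                    # start from one small hidden layer
    best, best_acc = current, evaluate(current, X_tr, y_tr, X_val, y_val)
    tabu = [tuple(current)]
    for _ in range(iterations):
        candidates = [a for a in neighbours(current) if tuple(a) not in tabu]
        if not candidates:
            break
        scored = [(evaluate(a, X_tr, y_tr, X_val, y_val), a) for a in candidates]
        acc, current = max(scored, key=lambda t: t[0])   # best admissible neighbour
        tabu = (tabu + [tuple(current)])[-tenure:]       # fixed-length tabu list
        if acc > best_acc:
            best, best_acc = current, acc
    return best, best_acc

if __name__ == "__main__":
    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.25, random_state=0)
    arch, acc = tabu_search(X_tr, y_tr, X_val, y_val)
    print(f"Best hidden-layer sizes: {arch}, validation accuracy: {acc:.3f}")

In this sketch the search accepts the best non-tabu neighbour even if it is worse than the current solution, which is what lets Tabu search escape local optima that plain greedy architecture search would get stuck in.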
Pages: 2855-2870 (16 pages)
Related Papers (50 in total)
  • [1] Gupta, Tarun Kumar; Raza, Khalid. Optimizing Deep Feedforward Neural Network Architecture: A Tabu Search Based Approach. Neural Processing Letters, 2020, 51: 2855-2870.
  • [2] He, Y.; Qiu, Y. H.; Liu, G. Y.; Lei, K. Y. Optimizing weights of neural network using an adaptive tabu search approach. Advances in Neural Networks - ISNN 2005, Pt 1, Proceedings, 2005, 3496: 672-676.
  • [3] Benardos, P. G.; Vosniakos, G.-C. Optimizing feedforward artificial neural network architecture. Engineering Applications of Artificial Intelligence, 2007, 20(3): 365-382.
  • [4] Qiang, Ning; Ge, Bao; Dong, Qinglin; Ge, Fangfei; Liu, Tianming. Neural Architecture Search for Optimizing Deep Belief Network Models of fMRI Data. Multiscale Multimodal Medical Imaging, MMMI 2019, 2020, 11977: 26-34.
  • [5] Su, Yuanxin; Ang, Li-minn; Seng, Kah Phooi; Smith, Jeremy. Deep Learning and Neural Architecture Search for Optimizing Binary Neural Network Image Super Resolution. Biomimetics, 2024, 9(6).
  • [6] Pan, Sichen; Gupta, Tarun Kumar; Raza, Khalid. BatTS: a hybrid method for optimizing deep feedforward neural network. PeerJ Computer Science, 2023, 9.
  • [7] Kwasigroch, Arkadiusz; Grochowski, Michal; Mikolajczyk, Mateusz. Deep neural network architecture search using network morphism. 2019 24th International Conference on Methods and Models in Automation and Robotics (MMAR), 2019: 30-35.
  • [8] Hu, Kai; Tian, Shuo; Guo, Shasha; Li, Nan; Luo, Li; Wang, Lei. Recurrent Neural Architecture Search based on Randomness-Enhanced Tabu Algorithm. 2020 International Joint Conference on Neural Networks (IJCNN), 2020.
  • [9] Aladag, Cagdas Hakan. A new architecture selection method based on tabu search for artificial neural networks. Expert Systems with Applications, 2011, 38(4): 3287-3293.
  • [10] Vokorokos, Liberios; Adam, Norbert. Modeling of Feedforward Neural Network in PAHRA Architecture. Proceedings of the 9th WSEAS International Conference on Simulation, Modelling and Optimization, 2009: 446+.