Automatic Configuration of Deep Neural Networks with Parallel Efficient Global Optimization

Cited by: 13
Authors
van Stein, Bas [1 ]
Wang, Hao [1 ]
Bäck, Thomas [1]
Affiliations
[1] Leiden Univ, LIACS, Leiden, Netherlands
Keywords
Deep Learning; Network Architectures; Bayesian Optimization; Optimization
DOI
10.1109/ijcnn.2019.8851720
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
Designing the architecture of an artificial neural network is a cumbersome task because of the numerous parameters to configure, including activation functions, layer types, and hyper-parameters. Given the large number of parameters in most modern networks, finding a good configuration for a given task by hand is intractable. In this paper, the Mixed Integer Parallel Efficient Global Optimization (MIP-EGO) algorithm is proposed to automatically configure convolutional neural network architectures. On several image classification tasks, this approach is shown to find network architectures that are competitive in prediction accuracy with the best hand-crafted ones in the literature, while using only a fraction of the number of training epochs. Moreover, instead of the standard sequential evaluation in EGO, several candidate architectures are proposed and evaluated in parallel, which reduces the execution overhead significantly and leads to efficient automation of deep neural network design.
Pages: 7
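
To make the parallel-proposal idea from the abstract concrete, below is a minimal, self-contained Python sketch of surrogate-assisted search over a mixed-integer configuration space with batch candidate proposal, in the spirit of MIP-EGO. It is not the authors' implementation: the toy search space, the random-forest surrogate, the expected-improvement acquisition, and the top-q batch selection are simplifying assumptions, and the paper's actual surrogate model and multi-point infill criterion may differ.

# Minimal sketch of mixed-integer, batch-parallel Bayesian optimization,
# in the spirit of MIP-EGO. NOT the authors' implementation; the search
# space, surrogate, acquisition, and batch rule are illustrative assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def sample_configs(n):
    """Draw n random points from a hypothetical mixed-integer space."""
    layers = rng.integers(1, 9, size=n)   # integer: number of conv layers
    act = rng.integers(0, 3, size=n)      # categorical (integer-coded activation)
    lr = rng.uniform(-4.0, -1.0, size=n)  # continuous: log10 learning rate
    return np.column_stack([layers, act, lr]).astype(float)

def objective(x):
    """Stand-in for validation error after a short training budget."""
    layers, act, lr = x
    return 0.02 * (layers - 5.0) ** 2 + 0.1 * act + (lr + 2.5) ** 2

def expected_improvement(model, X, y_best):
    """EI for minimization; tree spread serves as a crude uncertainty proxy."""
    per_tree = np.stack([t.predict(X) for t in model.estimators_])
    mu, sigma = per_tree.mean(axis=0), per_tree.std(axis=0) + 1e-9
    z = (y_best - mu) / sigma
    return sigma * (z * norm.cdf(z) + norm.pdf(z))

X = sample_configs(10)                    # initial design
y = np.array([objective(x) for x in X])
q = 4                                     # batch size: candidates proposed per iteration

for _ in range(5):
    model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
    cand = sample_configs(500)
    ei = expected_improvement(model, cand, y.min())
    batch = cand[np.argsort(-ei)[:q]]     # top-q by EI; evaluate these in parallel
    y_new = np.array([objective(x) for x in batch])
    X, y = np.vstack([X, batch]), np.concatenate([y, y_new])

print("best configuration:", X[y.argmin()], "estimated error:", y.min())

In a real run, the q evaluations per iteration (each a short CNN training) would be dispatched to separate workers, which is where the parallel speed-up over sequential EGO comes from.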