Dropout Algorithms for Recurrent Neural Networks

Cited by: 10
Authors
Watt, Nathan [1 ]
du Plessis, Mathys C. [1 ]
Affiliations
[1] Nelson Mandela Univ, Dept Comp Sci, POB 77000, ZA-6031 Port Elizabeth, South Africa
Keywords
Deep Learning; Recurrent Neural Networks; Dropout
DOI
10.1145/3278681.3278691
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
In the last decade, hardware advancements have allowed neural networks to grow much larger. Dropout is a popular deep learning technique which has been shown to improve the performance of large neural networks. Recurrent neural networks are powerful networks specialised for problems involving time-series data. Three different approaches to incorporating Dropout into recurrent neural networks have been suggested, but these approaches have not been evaluated under identical experimental conditions. This article investigates the performance of these Dropout approaches using a 2D physics simulation benchmark. After applying statistical tests, it was found that using Dropout did improve network performance on the benchmark. However, contrary to the literature, the Dropout approach which was expected to perform poorly performed well, and the approach which was expected to perform well performed poorly.
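As background (not part of this record): the Dropout placements compared in work of this kind differ mainly in which RNN connections are masked. Below is a minimal sketch of one such placement, masking only the non-recurrent (input and output) connections of an LSTM in the spirit of Pham et al. [4] in the related papers below; PyTorch is assumed, and the layer sizes and dropout rate are purely illustrative.

```python
import torch
import torch.nn as nn

class FeedforwardDropoutLSTM(nn.Module):
    """Sketch: Dropout on the non-recurrent connections only (cf. [4]);
    the recurrent hidden-to-hidden path is left unmasked."""

    def __init__(self, n_in=8, n_hidden=64, n_out=2, p=0.5):
        super().__init__()
        self.drop_in = nn.Dropout(p)    # mask input-to-hidden activations
        self.lstm = nn.LSTM(n_in, n_hidden, batch_first=True)
        self.drop_out = nn.Dropout(p)   # mask hidden-to-output activations
        self.head = nn.Linear(n_hidden, n_out)

    def forward(self, x):               # x: (batch, time, n_in)
        h, _ = self.lstm(self.drop_in(x))
        return self.head(self.drop_out(h))

model = FeedforwardDropoutLSTM()
y = model(torch.randn(4, 20, 8))        # (batch=4, time=20, n_in=8) -> (4, 20, 2)
```

By contrast, the variational scheme of Gal and Ghahramani [3] samples one mask per sequence and applies it to the recurrent connections as well; which placement performs better empirically is exactly what the article above re-examines.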
Pages: 72 - 78
Page count: 7
Related Papers
50 records in total
  • [1] Adversarial Dropout for Recurrent Neural Networks
    Park, Sungrae
    Song, Kyungwoo
    Ji, Mingi
    Lee, Wonsung
    Moon, Il-Chul
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 4699 - 4706
  • [2] Augmenting Recurrent Neural Networks Resilience by Dropout
    Bacciu, Davide
    Crecchi, Francesco
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2020, 31 (01) : 345 - 351
  • [3] A Theoretically Grounded Application of Dropout in Recurrent Neural Networks
    Gal, Yarin
    Ghahramani, Zoubin
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 29 (NIPS 2016), 2016, 29
  • [4] Dropout improves Recurrent Neural Networks for Handwriting Recognition
Pham, Vu
    Bluche, Theodore
    Kermorvant, Christopher
    Louradour, Jerome
    2014 14TH INTERNATIONAL CONFERENCE ON FRONTIERS IN HANDWRITING RECOGNITION (ICFHR), 2014, : 285 - 290
  • [5] Recurrent Neural Networks algorithms and applications
    Chen, Yuexing
    Li, Jiarun
    2021 2ND INTERNATIONAL CONFERENCE ON BIG DATA & ARTIFICIAL INTELLIGENCE & SOFTWARE ENGINEERING (ICBASE 2021), 2021, : 38 - 43
  • [6] Parallelization of Algorithms with Recurrent Neural Networks
    Pedro Neto, Joao
    Silva, Fernando
    ADAPTIVE AND NATURAL COMPUTING ALGORITHMS, PT I, 2011, 6593 : 61 - 69
  • [7] Where to Apply Dropout in Recurrent Neural Networks for Handwriting Recognition?
    Bluche, Theodore
    Kermorvant, Christopher
    Louradour, Jerome
    2015 13TH IAPR INTERNATIONAL CONFERENCE ON DOCUMENT ANALYSIS AND RECOGNITION (ICDAR), 2015, : 681 - 685
  • [8] Kalman and gradation algorithms used in recurrent neural networks
    Chen, Wei
    Wu, Jie
Huanan Ligong Daxue Xuebao/Journal of South China University of Technology (Natural Science), 1998, 26 (04) : 44 - 48
  • [9] Coevolution in recurrent neural networks using genetic algorithms
    Sato, Y
    Furuya, T
    SYSTEMS AND COMPUTERS IN JAPAN, 1996, 27 (05) : 64 - 73
  • [10] Online Arabic Handwriting Recognition with Dropout applied in Deep Recurrent Neural Networks
    Maalej, Rania
    Tagougui, Najiba
    Kherallah, Monji
    PROCEEDINGS OF 12TH IAPR WORKSHOP ON DOCUMENT ANALYSIS SYSTEMS, (DAS 2016), 2016, : 417 - 421