Dropout Algorithms for Recurrent Neural Networks

Cited by: 10
Authors
Watt, Nathan [1 ]
du Plessis, Mathys C. [1 ]
Affiliations
[1] Nelson Mandela Univ, Dept Comp Sci, POB 77000, ZA-6031 Port Elizabeth, South Africa
Keywords
Deep Learning; Recurrent Neural Networks; Dropout
DOI
10.1145/3278681.3278691
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Subject Classification Code
081202
Abstract
In the last decade, hardware advancements have allowed neural networks to grow much larger. Dropout is a popular deep learning technique which has been shown to improve the performance of large neural networks. Recurrent neural networks are powerful networks specialised for problems involving time-series data. Three different approaches to incorporating Dropout into recurrent neural networks have been suggested, but these approaches have not been evaluated under identical experimental conditions. This article investigates the performance of these Dropout approaches using a 2D physics simulation benchmark. Statistical tests showed that using Dropout did improve network performance on the benchmark. However, contrary to the literature, the Dropout approach that was expected to perform poorly performed well, and the approach that was expected to perform well performed poorly.
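The abstract does not name the three Dropout variants it compares; in the RNN dropout literature, variants typically differ in where the mask is applied and whether it is resampled at each time step. The sketch below is purely illustrative of that distinction and is not the paper's implementation: a plain NumPy tanh RNN whose hidden state is dropped either with a fresh per-step mask or with a single per-sequence mask (the fixed-mask, "variational" style). All function names and dimensions are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def dropout_mask(shape, rate, rng):
    # Inverted dropout: zero units with probability `rate`, scale
    # survivors by 1/(1-rate) so expected activations match test time.
    keep = 1.0 - rate
    return (rng.random(shape) < keep).astype(np.float32) / keep

def rnn_forward(x_seq, W_xh, W_hh, b_h, rate=0.5, per_sequence_mask=False, rng=rng):
    # Vanilla tanh RNN (illustrative, not the paper's model). The mask is
    # applied to the hidden state, so it affects both the step-t output and
    # the recurrent input to step t+1.
    # per_sequence_mask=False: resample the mask at every time step.
    # per_sequence_mask=True:  sample one mask and reuse it at every step.
    h = np.zeros(W_hh.shape[0], dtype=np.float32)
    fixed = dropout_mask(h.shape, rate, rng) if per_sequence_mask else None
    outputs = []
    for x_t in x_seq:
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        mask = fixed if per_sequence_mask else dropout_mask(h.shape, rate, rng)
        h = h * mask
        outputs.append(h)
    return np.stack(outputs)

# Toy dimensions (hypothetical): 5 time steps, 3 inputs, 4 hidden units.
T, D, H = 5, 3, 4
x_seq = rng.standard_normal((T, D)).astype(np.float32)
W_xh = (0.1 * rng.standard_normal((H, D))).astype(np.float32)
W_hh = (0.1 * rng.standard_normal((H, H))).astype(np.float32)
b_h = np.zeros(H, dtype=np.float32)

per_step = rnn_forward(x_seq, W_xh, W_hh, b_h, per_sequence_mask=False)
per_seq = rnn_forward(x_seq, W_xh, W_hh, b_h, per_sequence_mask=True)
print(per_step.shape, per_seq.shape)  # (5, 4) (5, 4)

Resampling the mask at every step perturbs the recurrent state differently at each transition, which parts of the literature argue harms long-range memory, while reusing one mask per sequence keeps the same units dropped throughout; that disagreement is one reason the relative ranking of these variants was worth testing under identical conditions.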
Pages: 72-78
Page count: 7
Related papers (50 in total)
  • [21] Recurrent Neural Networks Based Online Learning Algorithms for Distributed Systems
    Ergen, Tolga
    Sahin, S. Onur
    Kozat, S. Serdar
    2018 26th Signal Processing and Communications Applications Conference (SIU), 2018
  • [22] A unified framework of online learning algorithms for training recurrent neural networks
    Marschall, Owen
    Cho, Kyunghyun
    Savin, Cristina
    Journal of Machine Learning Research, 2020, 21
  • [23] Evolutionary algorithms that generate recurrent neural networks for learning chaos dynamics
    Sato, Y.
    Nagaya, S.
    Proceedings of the 1996 IEEE International Conference on Evolutionary Computation (ICEC '96), 1996: 144-149
  • [24] Minimal model dimension/order determination algorithms for recurrent neural networks
    Wang, Jeen-Shing
    Hsu, Yu-Liang
    Lin, Hung-Yi
    Chen, Yen-Ping
    Pattern Recognition Letters, 2009, 30 (9): 812-819
  • [26] Concurrent asynchronous learning algorithms for massively parallel recurrent neural networks
    Wu, C. H.
    Tsai, J. H.
    Journal of Parallel and Distributed Computing, 1992, 14 (3): 345-353
  • [27] Adding learning to cellular genetic algorithms for training recurrent neural networks
    Ku, K. W. C.
    Mak, M. W.
    Siu, W. C.
    IEEE Transactions on Neural Networks, 1999, 10 (2): 239-252
  • [28] Checkerboard Dropout: A Structured Dropout With Checkerboard Pattern for Convolutional Neural Networks
    Nguyen, Khanh-Binh
    Choi, Jaehyuk
    Yang, Joon-Sung
    IEEE Access, 2022, 10: 76044-76054
  • [29] Towards dropout training for convolutional neural networks
    Wu, Haibing
    Gu, Xiaodong
    Neural Networks, 2015, 71: 1-10
  • [30] Variational Bayesian dropout with a Gaussian prior for recurrent neural networks application in rainfall-runoff modeling
    Tabas, S. Sadeghi
    Samadi, S.
    Environmental Research Letters, 2022, 17 (6)