Recurrent Neural Networks algorithms and applications

Cited by: 3
Authors
Chen, Yuexing [1 ]
Li, Jiarun [2 ]
Affiliations
[1] State Univ New Jersey, Rutgers Business Sch, New Brunswick, NJ 07102 USA
[2] Sch Beijing Inst Petrochem Technol, Beijing, Peoples R China
Keywords
Recurrent Neural Networks; review; applications;
DOI
10.1109/ICBASE53849.2021.00015
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Since recurrent neural networks (RNNs) were first proposed, they have been widely used, and many extended RNN algorithms have been developed that achieve good results across a broad range of application fields. To report the latest research results on RNNs and help researchers quickly grasp recent progress in RNN algorithms, this study briefly introduces the basic principle of RNNs, reviews more than 20 papers on RNNs, and summarizes the previous work. It then presents qualitative and quantitative analyses of four representative models (CNN-RNN, ResNet-110, ResNet-164, and ResNet-1001); among them, the CNN-RNN model obtains the best performance on the CIFAR-100 dataset with standard data augmentation. Finally, the future development trend of RNNs is discussed.
Pages: 38-43
Page count: 6
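
The abstract refers to the basic principle of RNNs without stating it. For quick reference, the sketch below shows the standard Elman-style recurrence h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h) that extended RNN variants build on. This is a minimal illustrative sketch; the function name, variable names, and dimensions are assumptions for demonstration and are not taken from the paper.

# Minimal sketch of a vanilla (Elman-style) RNN forward pass.
# Names and shapes are illustrative assumptions, not the paper's implementation.
import numpy as np

def rnn_forward(x_seq, h0, W_xh, W_hh, b_h):
    """Run a vanilla RNN over a sequence.

    x_seq : (T, input_dim)  input sequence
    h0    : (hidden_dim,)   initial hidden state
    Returns the list of hidden states h_1..h_T.
    """
    h = h0
    states = []
    for x_t in x_seq:
        # Core recurrence: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h)
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        states.append(h)
    return states

# Toy usage with random weights (illustrative only).
rng = np.random.default_rng(0)
T, input_dim, hidden_dim = 5, 8, 16
states = rnn_forward(
    rng.normal(size=(T, input_dim)),
    np.zeros(hidden_dim),
    rng.normal(scale=0.1, size=(hidden_dim, input_dim)),
    rng.normal(scale=0.1, size=(hidden_dim, hidden_dim)),
    np.zeros(hidden_dim),
)
print(len(states), states[-1].shape)  # 5 (16,)

Because the same weight matrices are reused at every time step, the hidden state carries information across the sequence; the extended variants surveyed in the paper modify this recurrence (e.g., with gating) rather than replace it.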