Backpropagation through Time Algorithm for Training Recurrent Neural Networks using Variable Length Instances

Cited by: 0
Authors
Grau, Isel [1 ]
Napoles, Gonzalo [1 ]
Bonet, Isis [2 ]
Garcia, Maria Matilde [1 ]
Affiliations
[1] Univ Cent Marta Abreu Las Villas, Ctr Estudios Informat, Las Villas, Cuba
[2] Escuela Ingn Antioquia, Antioquia, Colombia
Source
COMPUTACION Y SISTEMAS | 2013, Vol. 17, No. 1
Keywords
Recurrent neural networks; backpropagation through time; sequence analysis; bioinformatics; artificial earthquakes;
DOI
Not available
Chinese Library Classification
TP [automation technology, computer technology]
Discipline Classification Code
0812
Abstract
Artificial Neural Networks (ANNs) belong to the connectionist techniques of Artificial Intelligence. In particular, Recurrent Neural Networks are a type of ANN widely used in signal reproduction tasks and sequence analysis, where causal relationships unfold in time and space. On the other hand, in many problems of science and engineering, the signals or sequences under analysis do not all have the same length, which makes it difficult to select a computational technique for processing them. This article presents a flexible implementation of Recurrent Neural Networks that allows designing the topology required by a specific application problem. Furthermore, the proposed model is capable of efficiently learning from knowledge bases whose instances have variable length. The performance of the proposed implementation is evaluated through a case study in bioinformatics sequence classification. We also mention its application to generating artificial earthquakes for seismic scenarios similar to those of Cuba.
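The core idea in the abstract, namely training on instances of variable length by unrolling the network exactly as far as each instance's own length (so no padding or truncation is needed), can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the single-layer Elman architecture with tanh units, a linear readout at the final time step, and squared-error loss are all assumptions made for the sketch.

```python
import numpy as np

def bptt_step(x_seq, target, Wx, Wh, Wy, lr=0.05):
    """One backpropagation-through-time update for a single sequence.

    x_seq  : (T, n_in) input sequence; T may differ per instance
    target : (n_out,) desired output at the final time step
    Wx, Wh, Wy : input, recurrent, and readout weights (updated in place)
    Returns the squared error before the update.
    """
    T = len(x_seq)
    H = Wh.shape[0]
    h = np.zeros((T + 1, H))               # h[0] is the initial hidden state
    # Forward pass: unroll exactly T steps, whatever T is for this instance
    for t in range(T):
        h[t + 1] = np.tanh(Wx @ x_seq[t] + Wh @ h[t])
    y = Wy @ h[T]                          # linear readout at the last step
    err = y - target

    # Backward pass: propagate the output error back through all T steps
    dWx, dWh = np.zeros_like(Wx), np.zeros_like(Wh)
    dWy = np.outer(err, h[T])
    dh = Wy.T @ err
    for t in reversed(range(T)):
        dz = dh * (1.0 - h[t + 1] ** 2)    # derivative of tanh
        dWx += np.outer(dz, x_seq[t])
        dWh += np.outer(dz, h[t])
        dh = Wh.T @ dz
    for W, dW in ((Wx, dWx), (Wh, dWh), (Wy, dWy)):
        W -= lr * dW                       # plain gradient-descent update
    return float(err @ err)

# Usage: two training sequences of different lengths share one set of weights
rng = np.random.default_rng(0)
Wx = 0.1 * rng.standard_normal((4, 2))
Wh = 0.1 * rng.standard_normal((4, 4))
Wy = 0.1 * rng.standard_normal((1, 4))
data = [(rng.standard_normal((3, 2)), np.array([1.0])),
        (rng.standard_normal((7, 2)), np.array([-1.0]))]
for _ in range(200):
    total_err = sum(bptt_step(x, t, Wx, Wh, Wy) for x, t in data)
```

Because each call unrolls the recurrence only over that instance's own time steps, sequences of length 3 and 7 train against the same weight matrices without any alignment preprocessing.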
Pages: 15-24 (10 pages)
Related Papers
50 records
  • [31] Combination of breeding swarm optimization and backpropagation algorithm for training recurrent fuzzy neural network
    Bahrampour, Soheil
    Ghorbanpour, Sahand
    Ramezani, Amin
    ICINCO 2008: PROCEEDINGS OF THE FIFTH INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS, VOL ICSO: INTELLIGENT CONTROL SYSTEMS AND OPTIMIZATION, 2008, : 314 - +
  • [32] An EM Based Training Algorithm for Recurrent Neural Networks
    Unkelbach, Jan
    Yi, Sun
    Schmidhuber, Juergen
    ARTIFICIAL NEURAL NETWORKS - ICANN 2009, PT I, 2009, 5768 : 964 - 974
  • [33] Neighborhood based modified backpropagation algorithm using adaptive learning parameters for training feedforward neural networks
    Kathirvalavakumar, T.
    Subavathi, S. Jeyaseeli
    NEUROCOMPUTING, 2009, 72 (16-18) : 3915 - 3921
  • [34] Spike-Train Level Backpropagation for Training Deep Recurrent Spiking Neural Networks
    Zhang, Wenrui
    Li, Peng
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 32 (NIPS 2019), 2019, 32
  • [35] Training recurrent neural networks by using parallel recursive prediction error algorithm
    Chen, DQ
    Chan, LW
    ICONIP'98: THE FIFTH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING JOINTLY WITH JNNS'98: THE 1998 ANNUAL CONFERENCE OF THE JAPANESE NEURAL NETWORK SOCIETY - PROCEEDINGS, VOLS 1-3, 1998, : 1393 - 1396
  • [36] Accelerating convolutional neural network training using ProMoD backpropagation algorithm
    Gurhanli, Ahmet
    IET IMAGE PROCESSING, 2020, 14 (13) : 2957 - 2964
  • [37] Generalized backpropagation through time for continuous time neural networks and discrete time measurements
    Fujarewicz, K
    Galuszka, A
    ARTIFICIAL INTELLIGENCE AND SOFT COMPUTING - ICAISC 2004, 2004, 3070 : 190 - 196
  • [38] Suitable Recurrent Neural Network for Air Quality Prediction With Backpropagation Through Time
    Septiawan, Widya Mas
    Endah, Sukmawati Nur
    2018 2ND INTERNATIONAL CONFERENCE ON INFORMATICS AND COMPUTATIONAL SCIENCES (ICICOS), 2018, : 196 - 201
  • [39] Backpropagation through nonlinear units for the all-optical training of neural networks
    Guo, Xianxin
    Barrett, Thomas D.
    Wang, Zhiming M.
    Lvovsky, A. I.
    PHOTONICS RESEARCH, 2021, 9 (03) : B71 - B80