Learning various length dependence by dual recurrent neural networks

Cited by: 3
Authors
Zhang, Chenpeng [1 ]
Li, Shuai [2 ]
Ye, Mao [1 ]
Zhu, Ce [2 ]
Li, Xue [3 ]
Affiliations
[1] Univ Elect Sci & Technol China, Sch Comp Sci & Engn, Chengdu 611731, Peoples R China
[2] Univ Elect Sci & Technol China, Sch Informat & Commun Engn, Chengdu 611731, Peoples R China
[3] Univ Queensland, Sch Informat Technol & Elect Engn, Brisbane, Qld 4072, Australia
Funding
National Natural Science Foundation of China; National Key R&D Program of China
Keywords
Sequence learning; Recurrent neural networks; Long-term; Dependence separating;
DOI
10.1016/j.neucom.2021.09.043
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recurrent neural networks (RNNs) are widely used as memory models for sequence-related problems. Many RNN variants have been proposed to mitigate the gradient problems of training RNNs and to process long sequences. Despite these classical models, capturing long-term dependence while responding to short-term changes remains a challenge. To address this problem, we propose a new model named Dual Recurrent Neural Networks (DuRNN). The DuRNN consists of two parts that learn short-term dependence and progressively learn long-term dependence. The first part is a recurrent neural network with constrained full recurrent connections, which handles short-term dependence in a sequence and generates short-term memory. The second part is a recurrent neural network with independent recurrent connections, which helps to learn long-term dependence and generates long-term memory. A selection mechanism is added between the two parts to transfer the needed long-term information to the independent neurons. Multiple modules can be stacked to form a multi-layer model for better performance. Our contributions are: 1) a new recurrent model, based on a divide-and-conquer strategy, that learns long-term and short-term dependence separately, and 2) a selection mechanism that enhances the separation and learning of dependence at different temporal scales. Both theoretical analysis and extensive experiments validate the performance of our model. Experimental results indicate that the proposed DuRNN handles not only very long sequences (over 5,000 time steps) but also short sequences very well. (c) 2021 Elsevier B.V. All rights reserved.
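The abstract describes the DuRNN architecture only in words. Below is a minimal, hypothetical PyTorch sketch of one dual module, reconstructed from that description; the exact cell types, the sigmoid selection gate, and all identifiers (DualRecurrentModule, select, long_rec, etc.) are illustrative assumptions rather than the authors' published formulation.

import torch
import torch.nn as nn


class DualRecurrentModule(nn.Module):
    """One DuRNN-style module: short-term RNN -> selection gate -> independent RNN."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Part 1 (assumed): fully recurrent cell capturing short-term dependence.
        # The paper's constraint on these recurrent connections is not reproduced here.
        self.short_cell = nn.RNNCell(input_size, hidden_size, nonlinearity="tanh")
        # Selection mechanism (assumed): sigmoid gate choosing which short-term
        # information is handed over to the independent (long-term) neurons.
        self.select = nn.Linear(hidden_size, hidden_size)
        # Part 2 (assumed): IndRNN-style recurrence with element-wise recurrent
        # weights, which eases learning of long-term dependence.
        self.long_in = nn.Linear(hidden_size, hidden_size)
        self.long_rec = nn.Parameter(torch.full((hidden_size,), 0.9))
        self.hidden_size = hidden_size

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (seq_len, batch, input_size) -> returns (seq_len, batch, hidden_size)
        seq_len, batch, _ = x.shape
        h_short = x.new_zeros(batch, self.hidden_size)
        h_long = x.new_zeros(batch, self.hidden_size)
        outputs = []
        for t in range(seq_len):
            h_short = self.short_cell(x[t], h_short)        # short-term memory
            gate = torch.sigmoid(self.select(h_short))      # selection mechanism
            selected = gate * h_short                       # transferred information
            # Independent recurrence: each neuron sees only its own previous state.
            h_long = torch.relu(self.long_in(selected) + self.long_rec * h_long)
            outputs.append(h_long)
        return torch.stack(outputs)


# Usage under the same assumptions: two stacked modules form a multi-layer model.
layers = nn.ModuleList([DualRecurrentModule(16, 32), DualRecurrentModule(32, 32)])
x = torch.randn(100, 4, 16)      # 100 time steps, batch of 4, 16 input features
for layer in layers:
    x = layer(x)
print(x.shape)                   # torch.Size([100, 4, 32])

The element-wise recurrent vector in the second part mirrors IndRNN-style recurrence, one common way to keep gradients stable over thousands of time steps; the paper's actual long-term cell and constraints may differ.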
Pages: 1-15
Number of pages: 15