Recurrent Neural Networks With Finite Memory Length

Cited: 7
Authors
Long, Dingkun [1 ,2 ]
Zhang, Richong [1 ,2 ]
Mao, Yongyi [3 ]
Affiliations
[1] Beihang Univ, BDBC, Beijing 100191, Peoples R China
[2] Beihang Univ, Sch Comp Sci & Engn, SKLSDE Lab, Beijing 100191, Peoples R China
[3] Univ Ottawa, Sch Elect Engn & Comp Sci, Ottawa, ON K1N 6N5, Canada
Source
IEEE ACCESS, 2019, Vol. 7
Funding
National Natural Science Foundation of China
Keywords
Recurrent neural networks; memory length
DOI
10.1109/ACCESS.2018.2890297
CLC Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
The inner workings of recurrent neural networks are still not well understood, and the construction of such network models therefore relies largely on heuristics and intuition. This paper formalizes the notion of "memory length" for recurrent networks and consequently discovers a generic family of recurrent networks having maximal memory length. Stacking such networks into multiple layers is shown to result in powerful models, including the gated convolutional networks. We show that the structure of such networks potentially enables a more principled design approach in practice and entails no gradient vanishing or exploding during back-propagation. We also present a new example in this family, termed the attentive activation recurrent unit (AARU). Experimentally, we demonstrate that the performance of this network family, and of AARU in particular, is superior to that of LSTM and GRU networks.
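To make the "finite memory length" idea concrete: a recurrent unit has memory length k when its output at step t depends only on the last k inputs. The sketch below is a minimal illustration of one such unit, a gated causal convolution of width k (the abstract notes that stacking maximal-memory-length units yields gated convolutional networks). It is not the paper's AARU, and the function and weight names (gated_causal_conv, W_f, W_g) are hypothetical, introduced only for this illustration.

    import numpy as np

    def gated_causal_conv(x, W_f, W_g, k):
        # Output at step t is a function of inputs t-k+1..t only, so the
        # unit's memory length is exactly k (assumed reading of the
        # abstract's definition; not the paper's exact construction).
        T, d_in = x.shape
        # Zero-pad on the left so every step sees a full window of k inputs.
        padded = np.vstack([np.zeros((k - 1, d_in)), x])
        outputs = []
        for t in range(T):
            window = padded[t:t + k].reshape(-1)       # last k inputs, flattened
            f = np.tanh(window @ W_f)                  # candidate features
            g = 1.0 / (1.0 + np.exp(-(window @ W_g)))  # sigmoid gate in (0, 1)
            outputs.append(f * g)                      # gated output
        return np.stack(outputs)                       # shape (T, d_out)

    # Toy usage: 6 steps, 3 input features, 2 output features, memory length 4.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(6, 3))
    W_f = rng.normal(size=(4 * 3, 2)) * 0.1
    W_g = rng.normal(size=(4 * 3, 2)) * 0.1
    y = gated_causal_conv(x, W_f, W_g, k=4)
    print(y.shape)  # (6, 2); y[t] is unaffected by x[:t-3]

Because each output touches at most k inputs, back-propagation unrolls through at most k steps, which is consistent with the abstract's claim that such structures entail no gradient vanishing or exploding.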
Pages: 12511-12520
Page count: 10
Related Papers
50 records in total (items [41]-[50] shown)
  • [41] Recurrent Neural Networks Improve Classification of Episodic Memory Encoding
    Arora, Akshay
    Segar, Sarah
    Umbach, Gray
    Lega, Bradley
    NEUROSURGERY, 2018, 65 : 92 - 92
  • [42] Empirical Analysis of Limits for Memory Distance in Recurrent Neural Networks
    Illium, Steffen
    Schillman, Thore
    Mueller, Robert
    Gabor, Thomas
    Linnhoff-Popien, Claudia
    ICAART: PROCEEDINGS OF THE 14TH INTERNATIONAL CONFERENCE ON AGENTS AND ARTIFICIAL INTELLIGENCE - VOL 3, 2022, : 308 - 315
  • [43] Recurrent Neural Networks with External Memory for Spoken Language Understanding
    Peng, Baolin
    Yao, Kaisheng
    Jing, Li
    Wong, Kam-Fai
    NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2015, 2015, 9362 : 25 - 35
  • [44] Eigenvalue Normalized Recurrent Neural Networks for Short Term Memory
    Helfrich, Kyle
    Ye, Qiang
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 4115 - 4122
  • [45] FARM: A Flexible Accelerator for Recurrent and Memory Augmented Neural Networks
    Challapalle, Nagadastagiri
    Rampalli, Sahithi
    Jao, Nicholas
    Ramanathan, Akshaykrishna
    Sampson, John
    Narayanan, Vijaykrishnan
    JOURNAL OF SIGNAL PROCESSING SYSTEMS FOR SIGNAL IMAGE AND VIDEO TECHNOLOGY, 2020, 92 (11): : 1247 - 1261
  • [46] MEMORY VISUALIZATION FOR GATED RECURRENT NEURAL NETWORKS IN SPEECH RECOGNITION
    Tang, Zhiyuan
    Shi, Ying
    Wang, Dong
    Feng, Yang
    Zhang, Shiyue
    2017 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2017, : 2736 - 2740
  • [47] Accelerating Recurrent Neural Networks: A Memory-Efficient Approach
    Wang, Zhisheng
    Lin, Jun
    Wang, Zhongfeng
    IEEE TRANSACTIONS ON VERY LARGE SCALE INTEGRATION (VLSI) SYSTEMS, 2017, 25 (10) : 2763 - 2775
  • [48] Using Stigmergy as a Computational Memory in the Design of Recurrent Neural Networks
    Galatolo, Federico A.
    Cimino, Mario G. C. A.
    Vaglini, Gigliola
    ICPRAM: PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS, 2019, : 830 - 836
  • [49] FINITE SIGNAL TRANSMISSION TIMES AND SYNAPTIC MEMORY IN NEURAL NETWORKS
    KRUGER, U
    MARTIENSSEN, W
    RISCHKE, DH
    PHYSICAL REVIEW E, 1995, 51 (05): : 5040 - 5047
  • [50] Correction to: Excitable networks for finite state computation with continuous time recurrent neural networks
    Ashwin, Peter
    Postlethwaite, Claire
    Biological Cybernetics, 2022, 116 : 117 - 117