State-Frequency Memory Recurrent Neural Networks

Cited by: 0
Authors
Hu, Hao [1]
Qi, Guo-Jun [1]
Affiliations
[1] Univ Cent Florida, Orlando, FL 32816 USA
Keywords
LSTM; LONG;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Modeling temporal sequences plays a fundamental role in many modern applications and has drawn increasing attention in the machine learning community. Among efforts to improve the representation of temporal data, the Long Short-Term Memory (LSTM) network has achieved great success in many areas. Although the LSTM can capture long-range dependencies in the time domain, it does not explicitly model pattern occurrences in the frequency domain, which play an important role in tracking and predicting data points over various time cycles. We propose the State-Frequency Memory (SFM), a novel recurrent architecture that separates dynamic patterns across different frequency components and models their impact on the temporal context of input sequences. By jointly decomposing memorized dynamics into state-frequency components, the SFM offers a fine-grained analysis of temporal sequences, capturing the dependency of uncovered patterns in both the time and frequency domains. Evaluations on several temporal modeling tasks demonstrate that the SFM yields competitive performance, in particular compared with state-of-the-art LSTM models.
Pages: 10
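To make the joint state-frequency decomposition described in the abstract concrete, below is a minimal NumPy sketch of an SFM-style recurrent cell. It is an illustrative approximation under stated assumptions, not the paper's exact formulation: the frequencies omega_k are fixed rather than learned, the forget gate is an outer product of a state gate and a frequency gate, per-frequency amplitudes are mean-pooled into the hidden state, and all weight names (W_i, W_g, W_fs, W_ff, W_o) are hypothetical.

# Minimal sketch of an SFM-style cell (assumptions noted above).
# The memory is a matrix S_t of shape (state_dim, n_freqs): each column
# tracks how strongly the state resonates with one frequency component.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SFMCell:
    def __init__(self, input_dim, state_dim, n_freqs, rng=None):
        rng = rng or np.random.default_rng(0)
        # Fixed, evenly spaced frequencies (an assumption; the paper's
        # frequencies need not be chosen this way).
        self.omegas = 2 * np.pi * np.arange(1, n_freqs + 1) / n_freqs
        d = state_dim
        def w(rows):  # one weight matrix per gate, acting on [x; h_prev]
            return rng.normal(0.0, 0.1, (rows, input_dim + d))
        self.W_i, self.W_g, self.W_fs, self.W_o = w(d), w(d), w(d), w(d)
        self.W_ff = w(n_freqs)  # frequency forget gate

    def step(self, x, h_prev, S_re, S_im, t):
        z = np.concatenate([x, h_prev])
        f_state = sigmoid(self.W_fs @ z)        # (d,)  forget over states
        f_freq = sigmoid(self.W_ff @ z)         # (k,)  forget over frequencies
        F = np.outer(f_state, f_freq)           # joint state-frequency forget
        inp = sigmoid(self.W_g @ z) * np.tanh(self.W_i @ z)  # gated input
        # Write the gated input onto each frequency's oscillation at time t,
        # keeping real and imaginary parts of the complex memory separately.
        S_re = F * S_re + np.outer(inp, np.cos(self.omegas * t))
        S_im = F * S_im + np.outer(inp, np.sin(self.omegas * t))
        A = np.sqrt(S_re**2 + S_im**2)          # per-frequency amplitudes
        # Pool amplitudes over frequencies into the hidden state (a
        # simplification; per-frequency output gating is also plausible).
        h = sigmoid(self.W_o @ z) * np.tanh(A.mean(axis=1))
        return h, S_re, S_im

# Usage: run the cell over a random sequence of 20 four-dimensional inputs.
cell = SFMCell(input_dim=4, state_dim=8, n_freqs=5)
h, S_re, S_im = np.zeros(8), np.zeros((8, 5)), np.zeros((8, 5))
for t, x in enumerate(np.random.default_rng(1).normal(size=(20, 4))):
    h, S_re, S_im = cell.step(x, h, S_re, S_im, t)

The design point the sketch tries to convey is the outer-product forget gate F: unlike an LSTM, which forgets per state dimension only, an SFM-style cell can retain a slow frequency component while discarding a fast one within the same state dimension, which is what enables the fine-grained time-frequency analysis the abstract describes.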