State-Frequency Memory Recurrent Neural Networks

Cited by: 0
Authors
Hu, Hao [1 ]
Qi, Guo-Jun [1 ]
Institutions
[1] University of Central Florida, Orlando, FL 32816, USA
Keywords
LSTM; LONG;
DOI
None available
CLC number
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Modeling temporal sequences plays a fundamental role in many modern applications and has drawn increasing attention in the machine learning community. Among the efforts to improve the representation of temporal data, the Long Short-Term Memory (LSTM) has achieved great success in many areas. Although the LSTM can capture long-range dependency in the time domain, it does not explicitly model pattern occurrences in the frequency domain, which play an important role in tracking and predicting data points over various time cycles. We propose the State-Frequency Memory (SFM), a novel recurrent architecture that separates dynamic patterns across different frequency components and models their impacts on the temporal contexts of input sequences. By jointly decomposing memorized dynamics into state-frequency components, the SFM offers a fine-grained analysis of temporal sequences, capturing the dependency of uncovered patterns in both the time and frequency domains. Evaluations on several temporal modeling tasks demonstrate that the SFM yields competitive performance, in particular compared with state-of-the-art LSTM models.
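To make the joint state-frequency decomposition concrete, below is a minimal NumPy sketch of one recurrence step of an SFM-style cell. It is an illustrative simplification, not the paper's exact parameterization: the gate wiring, the frequency grid `omegas`, and the frequency-mixing vector `u_a` are all assumptions. The key idea it demonstrates is that the memory is a (hidden-state x frequency) matrix whose real and imaginary parts accumulate a Fourier-like decomposition of the input dynamics, with a forget gate formed as the outer product of a state gate and a frequency gate.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def init_sfm(input_dim, hidden_dim, n_freq, seed=0):
    """Random parameters for a simplified SFM cell (a sketch, not the
    paper's exact parameterization)."""
    rng = np.random.default_rng(seed)
    def affine(out_dim):
        return (rng.normal(scale=0.1, size=(out_dim, input_dim)),
                rng.normal(scale=0.1, size=(out_dim, hidden_dim)),
                np.zeros(out_dim))
    params = {name: affine(hidden_dim) for name in ["f_state", "i", "g", "o"]}
    params["f_freq"] = affine(n_freq)                     # per-frequency forget gate
    params["u_a"] = rng.normal(scale=0.1, size=n_freq)    # mixes frequency bands
    return params

def sfm_step(x, t, z_prev, S_re, S_im, params, omegas):
    """One step: S_re/S_im are (D, K) real/imag parts of the
    state-frequency memory, D hidden states x K frequency bands."""
    def gate(name, act):
        W, U, b = params[name]
        return act(W @ x + U @ z_prev + b)
    f_s = gate("f_state", sigmoid)    # (D,) state forget gate
    f_w = gate("f_freq", sigmoid)     # (K,) frequency forget gate
    i   = gate("i", sigmoid)          # (D,) input gate
    g   = gate("g", np.tanh)          # (D,) input modulation
    o   = gate("o", sigmoid)          # (D,) output gate
    F = np.outer(f_s, f_w)            # joint state-frequency forgetting (D, K)
    # Write the gated input onto a Fourier-like basis at the current time step.
    S_re = F * S_re + np.outer(i * g, np.cos(omegas * t))
    S_im = F * S_im + np.outer(i * g, np.sin(omegas * t))
    A = np.sqrt(S_re**2 + S_im**2)          # amplitude per state-frequency pair
    z = o * np.tanh(A @ params["u_a"])      # collapse frequency axis -> (D,)
    return z, S_re, S_im

# Run on a toy random sequence.
D, K, X = 8, 4, 3
omegas = 2 * np.pi * np.arange(1, K + 1) / 16.0   # assumed frequency grid
p = init_sfm(X, D, K)
z, S_re, S_im = np.zeros(D), np.zeros((D, K)), np.zeros((D, K))
rng = np.random.default_rng(1)
for t, x in enumerate(rng.normal(size=(20, X)), start=1):
    z, S_re, S_im = sfm_step(x, t, z, S_re, S_im, p, omegas)
print(z.shape, S_re.shape)   # (8,) (8, 4)
```

Note how setting K = 1 with a constant frequency basis would collapse the memory matrix back to an LSTM-like cell-state vector; the extra frequency axis is what lets the cell track patterns at different time cycles separately.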
Pages: 10
Related papers
50 records in total
  • [21] Memory Analysis for Memristors and Memristive Recurrent Neural Networks
    Gang Bao
    Yide Zhang
    Zhigang Zeng
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2020, 7 (01) : 96 - 105
  • [22] Memory in linear recurrent neural networks in continuous time
    Hermans, Michiel
    Schrauwen, Benjamin
    NEURAL NETWORKS, 2010, 23 (03) : 341 - 355
  • [23] Encoding-based memory for recurrent neural networks
    Carta, Antonio
    Sperduti, Alessandro
    Bacciu, Davide
    NEUROCOMPUTING, 2021, 456 (456) : 407 - 420
  • [24] Memory analysis for memristors and memristive recurrent neural networks
    Bao, Gang
    Zhang, Yide
    Zeng, Zhigang
    IEEE-CAA JOURNAL OF AUTOMATICA SINICA, 2020, 7 (01) : 96 - 105
  • [25] Associative memory by recurrent neural networks with delay elements
    Miyoshi, S
    Yanai, HF
    Okada, M
    ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002, : 70 - 74
  • [26] Associative memory by recurrent neural networks with delay elements
    Miyoshi, S
    Yanai, HF
    Okada, M
    NEURAL NETWORKS, 2004, 17 (01) : 55 - 63
  • [27] Neural Mechanisms of Working Memory Accuracy Revealed by Recurrent Neural Networks
    Xie, Yuanqi
    Liu, Yichen Henry
    Constantinidis, Christos
    Zhou, Xin
    FRONTIERS IN SYSTEMS NEUROSCIENCE, 2022, 16
  • [28] UNDERSTANDING RECURRENT NEURAL STATE USING MEMORY SIGNATURES
    Koppula, Skanda
    Sim, Khe Chai
    Chin, Kean
    2018 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2018, : 2396 - 2400
  • [29] State Estimation for Recurrent Neural Networks With Intermittent Transmission
    Liu, Chang
    Rao, Hongxia
    Yu, Xinxin
    Xu, Yong
    Su, Chun-Yi
    IEEE TRANSACTIONS ON CYBERNETICS, 2024, 54 (05) : 2891 - 2900
  • [30] On the Interpretation of Recurrent Neural Networks as Finite State Machines
    Oliva, Christian
    Lago-Fernandez, Luis F.
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2019: THEORETICAL NEURAL COMPUTATION, PT I, 2019, 11727 : 312 - 323