Learning to Adaptively Scale Recurrent Neural Networks

Cited by: 0
Authors
Hu, Hao [1 ]
Wang, Liqiang [1 ]
Qi, Guo-Jun [2 ]
Affiliations
[1] Univ Cent Florida, Orlando, FL 32816 USA
[2] Huawei Cloud, Shenzhen, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
LONG-TERM DEPENDENCIES;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Recent advances in recurrent neural network (RNN) research have demonstrated the superiority of multiscale structures for learning temporal representations of time series. However, most existing multiscale RNNs use fixed scales, which do not reflect the dynamic temporal patterns found in real sequences. In this paper, we propose Adaptively Scaled Recurrent Neural Networks (ASRNNs), a simple but efficient way to address this problem. Instead of using predefined scales, ASRNNs learn and adjust scales according to the temporal context, making them more flexible in modeling multiscale patterns. Compared with other multiscale RNNs, ASRNNs achieve dynamic scaling with much simpler structures and are easy to integrate with various RNN cells. Experiments on multiple sequence modeling tasks show that ASRNNs efficiently adapt scales to different sequence contexts and outperform baselines that lack dynamic scaling.
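To make the idea in the abstract concrete, the sketch below shows one plausible way a recurrent cell could learn input-dependent scales: a gate predicts, at every step, how strongly the hidden state should be updated, which amounts to a learned, context-dependent timescale per unit. This is a minimal hypothetical illustration, not the authors' implementation; the `AdaptiveScaleRNN` class, the `scale_gate` parameterization, and the interpolation equation are assumptions chosen to mimic the behavior the abstract describes.

```python
import torch
import torch.nn as nn

class AdaptiveScaleRNN(nn.Module):
    """Hypothetical sketch of an adaptively scaled RNN (not the paper's code)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Any recurrent cell can serve as the base; a vanilla RNNCell keeps
        # the sketch short.
        self.cell = nn.RNNCell(input_size, hidden_size)
        # Assumed gate: predicts a per-unit scale in (0, 1) from the current
        # input and the previous hidden state.
        self.scale_gate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor):
        # x has shape (seq_len, batch, input_size).
        h = x.new_zeros(x.size(1), self.cell.hidden_size)
        outputs = []
        for x_t in x:
            candidate = self.cell(x_t, h)
            # alpha near 1 -> fast update (short timescale);
            # alpha near 0 -> slow update (long timescale / long-term memory).
            alpha = torch.sigmoid(self.scale_gate(torch.cat([x_t, h], dim=-1)))
            h = alpha * candidate + (1.0 - alpha) * h
            outputs.append(h)
        return torch.stack(outputs), h

# Example usage:
# model = AdaptiveScaleRNN(input_size=16, hidden_size=32)
# outputs, final_h = model(torch.randn(10, 4, 16))  # seq_len=10, batch=4
```

Because the scale is produced by an ordinary gate, this kind of wrapper can be placed around LSTM or GRU cells just as easily, which matches the abstract's claim that the mechanism integrates with various RNN cells.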
Pages: 3822 - 3829
Number of pages: 8
Related Papers
50 records in total
  • [21] Learning Morphological Transformations with Recurrent Neural Networks
    Biswas, Saurav
    Breuel, Thomas
    INNS CONFERENCE ON BIG DATA 2015 PROGRAM, 2015, 53 : 335 - 344
  • [22] Learning Multiple Timescales in Recurrent Neural Networks
    Alpay, Tayfun
    Heinrich, Stefan
    Wermter, Stefan
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2016, PT I, 2016, 9886 : 132 - 139
  • [23] Quantum recurrent neural networks for sequential learning
    Li, Yanan
    Wang, Zhimin
    Han, Rongbing
    Shi, Shangshang
    Li, Jiaxin
    Shang, Ruimin
    Zheng, Haiyong
    Zhong, Guoqiang
    Gu, Yongjian
    NEURAL NETWORKS, 2023, 166 : 148 - 161
  • [24] Learning minimal automata with recurrent neural networks
    Aichernig, Bernhard K.
    Koenig, Sandra
    Mateis, Cristinel
    Pferscher, Andrea
    Tappler, Martin
SOFTWARE AND SYSTEMS MODELING, 2024, 23 (03) : 625 - 655
  • [26] A Convergence Result for Learning in Recurrent Neural Networks
    Kuan, C. -M.
    Hornik, K.
    White, H.
    NEURAL COMPUTATION, 1994, 6 (03) : 420 - 440
  • [27] Learning algorithms and the shape of the learning surface in recurrent neural networks
    Watanabe, Tatsumi
    Uchikawa, Yoshiki
    Gouhara, Kazutoshi
SYSTEMS AND COMPUTERS IN JAPAN, 1992, 23 (13) : 90 - 107
  • [28] On the improvement of the real time recurrent learning algorithm for recurrent neural networks
    Mak, MW
    Ku, KW
    Lu, YL
    NEUROCOMPUTING, 1999, 24 (1-3) : 13 - 36
  • [29] On the Learning Capabilities of Recurrent Neural Networks: A Cryptographic Perspective
    Srivastava, Shivin
    Bhatia, Ashutosh
    2018 9TH IEEE INTERNATIONAL CONFERENCE ON BIG KNOWLEDGE (ICBK), 2018, : 162 - 167
  • [30] Learning Rule for Associative Memory in Recurrent Neural Networks
    Jacob, Theju
    Snyder, Wesley
    2015 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2015,