RECURRENT NEURAL NETWORKS WITH FLEXIBLE GATES USING KERNEL ACTIVATION FUNCTIONS

Cited by: 0
Authors
Scardapane, Simone [1 ]
Van Vaerenbergh, Steven [2 ]
Comminiello, Danilo [1 ]
Totaro, Simone [1 ]
Uncini, Aurelio [1 ]
Affiliations
[1] Sapienza Univ Rome, Rome, Italy
[2] Univ Cantabria, Santander, Spain
Keywords
Recurrent network; LSTM; GRU; Gate; Kernel activation function;
DOI
Not available
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gated recurrent neural networks have achieved remarkable results in the analysis of sequential data. Inside these networks, gates control the flow of information, making it possible to model even very long-term dependencies in the data. In this paper, we investigate whether the original gate equation (a linear projection followed by an element-wise sigmoid) can be improved. In particular, we design a more flexible architecture, with a small number of adaptable parameters, that can model a wider range of gating functions than the classical one. To this end, we replace the sigmoid function in the standard gate with a non-parametric formulation extending the recently proposed kernel activation function (KAF), augmented with a residual skip-connection. A set of experiments on sequential variants of the MNIST dataset shows that adopting this novel gate improves accuracy with a negligible cost in computational power and a large speed-up in the number of training iterations.
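The abstract does not spell out the exact gate equation, so the following is only a minimal pure-Python sketch of one plausible reading: a Gaussian-kernel KAF (a kernel expansion over a fixed dictionary with adaptable mixing coefficients) added as a residual term to the pre-activation before the sigmoid squashing. The function names, the dictionary points, the kernel bandwidth, and the exact placement of the residual are illustrative assumptions, not the paper's definitive formulation.

```python
import math

def kaf(s, alpha, dictionary, gamma=1.0):
    """Kernel activation function (sketch): a 1-D Gaussian kernel
    expansion over a fixed dictionary of points, with mixing
    coefficients alpha (adaptable parameters in the paper;
    plain floats here)."""
    return sum(a * math.exp(-gamma * (s - d) ** 2)
               for a, d in zip(alpha, dictionary))

def flexible_gate(s, alpha, dictionary, gamma=1.0):
    """Hypothetical flexible gate: sigmoid applied to the
    pre-activation plus a KAF residual skip term (one possible
    interpretation of the abstract's description)."""
    return 1.0 / (1.0 + math.exp(-(s + kaf(s, alpha, dictionary, gamma))))

# With all alpha = 0 the KAF residual vanishes and the gate
# reduces to the standard logistic sigmoid of the pre-activation.
dictionary = [-2.0, -1.0, 0.0, 1.0, 2.0]
alpha_zero = [0.0] * len(dictionary)
print(flexible_gate(0.0, alpha_zero, dictionary))  # 0.5 (standard sigmoid at 0)
```

Keeping a small, fixed dictionary means the gate adds only a handful of coefficients per unit, consistent with the abstract's claim of "a small number of adaptable parameters" and negligible computational overhead.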
Pages: 6
Related Papers
50 records in total
  • [1] Recurrent neural networks with trainable amplitude of activation functions
    Goh, SL
    Mandic, DP
    NEURAL NETWORKS, 2003, 16 (08) : 1095 - 1100
  • [2] Learning Activation Functions by Means of Kernel Based Neural Networks
    Marra, Giuseppe
    Zanca, Dario
    Betti, Alessandro
    Gori, Marco
    ADVANCES IN ARTIFICIAL INTELLIGENCE, AI*IA 2019, 2019, 11946 : 418 - 430
  • [3] Sound synthesis by flexible activation function recurrent neural networks
    Uncini, A
    NEURAL NETS, 2002, 2486 : 168 - 177
  • [4] Flexible Recurrent Neural Networks
    Lambert, Anne
    Le Bolzer, Francoise
    Schnitzler, Francois
    MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2020, PT I, 2021, 12457 : 694 - 709
  • [5] RBF neural networks for classification using new kernel functions
    Bozdogan, H
    Liu, ZQ
    COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2002, : 17 - 22
  • [6] Complete stability of delayed recurrent neural networks with Gaussian activation functions
    Liu, Peng
    Zeng, Zhigang
    Wang, Jun
    NEURAL NETWORKS, 2017, 85 : 21 - 32
  • [7] Multistability of Delayed Recurrent Neural Networks with Mexican Hat Activation Functions
    Liu, Peng
    Zeng, Zhigang
    Wang, Jun
    NEURAL COMPUTATION, 2017, 29 (02) : 423 - 457
  • [8] Stability of Stochastic Recurrent Neural Networks with Positive Linear Activation Functions
    Liao, Wudai
    Yang, Xuezhao
    Wang, Zhongsheng
    ADVANCES IN NEURAL NETWORKS - ISNN 2009, PT 1, PROCEEDINGS, 2009, 5551 : 279 - +
  • [9] Temporal-Kernel Recurrent Neural Networks
    Sutskever, Ilya
    Hinton, Geoffrey
    NEURAL NETWORKS, 2010, 23 (02) : 239 - 243
  • [10] Recurrent neural network synthesis using interaction activation functions
    Novakovic, BM
    1996 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, PROCEEDINGS, VOLS 1-4, 1996, : 1608 - 1613