RECURRENT NEURAL NETWORKS WITH FLEXIBLE GATES USING KERNEL ACTIVATION FUNCTIONS

Cited by: 0
Authors
Scardapane, Simone [1 ]
Van Vaerenbergh, Steven [2 ]
Comminiello, Danilo [1 ]
Totaro, Simone [1 ]
Uncini, Aurelio [1 ]
Affiliations
[1] Sapienza Univ Rome, Rome, Italy
[2] Univ Cantabria, Santander, Spain
Keywords
Recurrent network; LSTM; GRU; Gate; Kernel activation function;
DOI
Not available
CLC classification code
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Gated recurrent neural networks have achieved remarkable results in the analysis of sequential data. Inside these networks, gates are used to control the flow of information, allowing the model to capture even very long-term dependencies in the data. In this paper, we investigate whether the original gate equation (a linear projection followed by an element-wise sigmoid) can be improved. In particular, we design a more flexible architecture, with a small number of adaptable parameters, that can model a wider range of gating functions than the classical one. To this end, we replace the sigmoid function in the standard gate with a non-parametric formulation extending the recently proposed kernel activation function (KAF), with the addition of a residual skip-connection. A set of experiments on sequential variants of the MNIST dataset shows that adopting this novel gate improves accuracy at a negligible cost in computational power, with a large speed-up in the number of training iterations.
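The gate structure summarized in the abstract can be sketched numerically. The snippet below is a minimal, illustrative NumPy sketch, not the paper's exact formulation: the Gaussian kernel, the fixed uniform dictionary, and adding the KAF term to the base sigmoid as the residual skip-connection are all assumptions made here for clarity.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def kaf_gate(s, alpha, dictionary, gamma=1.0):
    """Flexible gate: a 1-D kernel expansion (KAF) over a fixed dictionary
    with learnable mixing coefficients `alpha`, added as a residual
    correction on top of the standard sigmoid gate (assumed form).

    s          : pre-activation values, shape (n,)
    alpha      : mixing coefficients, shape (D,)
    dictionary : fixed sample points of the kernel expansion, shape (D,)
    """
    # Gaussian kernel between each pre-activation and each dictionary point
    K = np.exp(-gamma * (s[:, None] - dictionary[None, :]) ** 2)  # (n, D)
    kaf = K @ alpha                                               # (n,)
    # residual skip-connection: the adaptable KAF term corrects the sigmoid
    return sigmoid(s) + kaf

# With alpha = 0 the gate reduces exactly to the classical sigmoid gate.
dictionary = np.linspace(-2.0, 2.0, 20)
s = np.array([-1.0, 0.0, 1.0])
out = kaf_gate(s, np.zeros(20), dictionary)
```

The design keeps the number of adaptable parameters small (one coefficient per dictionary point), which is the property the abstract attributes to the proposed architecture.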
Pages: 6
Related papers
50 records in total
  • [21] Multistability analysis of delayed recurrent neural networks with a class of piecewise nonlinear activation functions
    Liu, Yang
    Wang, Zhen
    Ma, Qian
    Shen, Hao
    NEURAL NETWORKS, 2022, 152 : 80 - 89
  • [22] Scalable Partial Explainability in Neural Networks via Flexible Activation Functions (Student Abstract)
    Sun, Schyler C.
    Li, Chen
    Wei, Zhuangkun
    Tsourdos, Antonios
    Guo, Weisi
    THIRTY-FIFTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THIRTY-THIRD CONFERENCE ON INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE AND THE ELEVENTH SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2021, 35 : 15899 - 15900
  • [23] Wavelets as activation functions in Neural Networks
    Herrera, Oscar
    Priego, Belem
    JOURNAL OF INTELLIGENT & FUZZY SYSTEMS, 2022, 42 (05) : 4345 - 4355
  • [24] Shape and vibration control of a multibody flexible structure using recurrent neural networks
    Bernelli-Zazzera, Franco
    Ercoli-Finzi, Amalia
    Romano, Marcello
    Tomasi, Massimo
    Journal of the Chinese Society of Mechanical Engineers, Transactions of the Chinese Institute of Engineers, Series C/Chung-Kuo Chi Hsueh Kung Ch'eng Hsueh Pao, 2000, 21 (01): 67 - 76
  • [25] Improvement of joint optimization of masks and deep recurrent neural networks for monaural speech separation using optimized activation functions
    Masood, Asim
    Ye, Zhongfu
    Chinese Journal of Acoustics, 2020, 39 (03) : 420 - 432
  • [26] Simple activation functions for neural and fuzzy neural networks
    Mendil, B
    Benmahammed, K
    ISCAS '99: PROCEEDINGS OF THE 1999 IEEE INTERNATIONAL SYMPOSIUM ON CIRCUITS AND SYSTEMS, VOL 5: SYSTEMS, POWER ELECTRONICS, AND NEURAL NETWORKS, 1999, : 347 - 350
  • [27] Simple activation functions for neural and fuzzy neural networks
    Mendil, Boubekeur
    Benmahammed, K.
    Proceedings - IEEE International Symposium on Circuits and Systems, 1999, 5
  • [28] Constructive feedforward neural networks using hermite polynomial activation functions
    Ma, LY
    Khorasani, K
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 2005, 16 (04): 821 - 833
  • [29] Universal Approximation Using Probabilistic Neural Networks with Sigmoid Activation Functions
    Murugadoss, R.
    Ramakrishnan, M.
    2014 INTERNATIONAL CONFERENCE ON ADVANCES IN ENGINEERING AND TECHNOLOGY RESEARCH (ICAETR), 2014,
  • [30] Ordinal Classification Using Hybrid Artificial Neural Networks with Projection and Kernel Basis Functions
    Dorado-Moreno, M.
    Gutierrez, P. A.
    Hervas-Martinez, C.
    HYBRID ARTIFICIAL INTELLIGENT SYSTEMS, PT II, 2012, 7209 : 319 - 330