Regularising neural networks using flexible multivariate activation function

Cited by: 23
|
Authors
Solazzi, M
Uncini, A
Affiliations
[1] Univ Roma La Sapienza, Dipartimento INFOCOM, I-00184 Rome, Italy
[2] Univ Ancona, Dipartimento Elettron & Automat, I-60131 Ancona, Italy
Keywords
neural networks; spline neural networks; multilayer perceptron; generalised sigmoidal functions; adaptive activation functions; spline; regularisation; generalisation;
DOI
10.1016/S0893-6080(03)00189-8
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper presents a new general neural structure based on a nonlinear flexible multivariate function that can be viewed in the framework of generalised regularisation networks theory. The proposed architecture is based on a multi-dimensional adaptive cubic spline basis activation function that collects information from the previous network layer in aggregate form. In other words, each activation function represents a spline function of a subset of the previous layer's outputs, so the number of network connections (structural complexity) can be very low with respect to the problem complexity. A specific learning algorithm, based on the adaptation of the local parameters of the activation function, is derived. This improves the network's generalisation capabilities and speeds up the convergence of the learning process. Finally, experimental results demonstrating the effectiveness of the proposed architecture are presented. (C) 2003 Elsevier Ltd. All rights reserved.
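To make the idea of an adaptive spline activation concrete, the following is a minimal NumPy sketch of a one-dimensional Catmull-Rom cubic spline activation with trainable control-point ordinates, in the spirit of the spline neural networks described above. It is an illustrative assumption, not the paper's exact multivariate formulation: the class name `SplineActivation`, the control-point count, and the tanh-shaped initialisation are all hypothetical choices.

```python
import numpy as np

# Standard Catmull-Rom blending matrix: a cubic through 4 local
# control points that interpolates the middle two.
CR = 0.5 * np.array([[-1.0,  3.0, -3.0,  1.0],
                     [ 2.0, -5.0,  4.0, -1.0],
                     [-1.0,  0.0,  1.0,  0.0],
                     [ 0.0,  2.0,  0.0,  0.0]])

class SplineActivation:
    """1-D adaptive spline activation (illustrative sketch).

    The ordinates `q` of uniformly spaced control points are the
    adaptable local parameters; a learning rule would update only
    the 4 points spanning each input, which is what keeps the
    adaptation local and cheap.
    """

    def __init__(self, n_points=21, x_range=2.0):
        self.dx = 2.0 * x_range / (n_points - 1)
        xs = np.linspace(-x_range, x_range, n_points)
        self.q = np.tanh(xs)   # initialise on a sigmoid-like shape
        self.x0 = -x_range

    def __call__(self, x):
        # Locate the spline span index i and local abscissa u in [0, 1).
        s = (x - self.x0) / self.dx
        i = int(np.clip(np.floor(s), 1, len(self.q) - 3))
        u = s - i
        U = np.array([u**3, u**2, u, 1.0])
        qs = self.q[i - 1:i + 3]   # the 4 local control points
        return float(U @ CR @ qs)
```

Because the curve interpolates its control points, the freshly initialised activation coincides with `tanh` at the knots (e.g. it returns 0 at the origin) and is smoothly deformed as training adjusts `q`.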
Pages: 247-260
Page count: 14
Related Papers
50 records
  • [1] Regularized Flexible Activation Function Combination for Deep Neural Networks
    Jie, Renlong
    Gao, Junbin
    Vasnev, Andrey
    Tran, Minh-ngoc
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 2001 - 2008
  • [2] Sound synthesis by flexible activation function recurrent neural networks
    Uncini, A
    [J]. NEURAL NETS, 2002, 2486 : 168 - 177
  • [3] RECURRENT NEURAL NETWORKS WITH FLEXIBLE GATES USING KERNEL ACTIVATION FUNCTIONS
    Scardapane, Simone
    Van Vaerenbergh, Steven
    Comminiello, Danilo
    Totaro, Simone
    Uncini, Aurelio
    [J]. 2018 IEEE 28TH INTERNATIONAL WORKSHOP ON MACHINE LEARNING FOR SIGNAL PROCESSING (MLSP), 2018,
  • [4] Periodic Function as Activation Function for Neural Networks
    Xu, Ding
    Guan, Yue
    Cai, Ping-ping
    [J]. INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE: TECHNIQUES AND APPLICATIONS, AITA 2016, 2016, : 179 - 183
  • [5] Software reliability prediction using neural networks with linear activation function
    Misra, RB
    Sasatte, PV
    [J]. ADVANCED RELIABILITY MODELING, 2004, : 333 - 340
  • [6] Neural networks with asymmetric activation function for function approximation
    Gomes, Gecynalda S. da S.
    Ludermir, Teresa B.
    Almeida, Leandro M.
    [J]. IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1- 6, 2009, : 2310 - 2317
  • [7] Multistability of neural networks with discontinuous activation function
    Huang, Gan
    Cao, Jinde
    [J]. COMMUNICATIONS IN NONLINEAR SCIENCE AND NUMERICAL SIMULATION, 2008, 13 (10) : 2279 - 2289
  • [8] Adaptive Morphing Activation Function for Neural Networks
    Herrera-Alcantara, Oscar
    Arellano-Balderas, Salvador
    [J]. FRACTAL AND FRACTIONAL, 2024, 8 (08)
  • [9] Activation function of wavelet chaotic neural networks
    Xu, Yao-Qun
    Sun, Ming
    Guo, Meng-Shu
    [J]. PROCEEDINGS OF THE FIFTH IEEE INTERNATIONAL CONFERENCE ON COGNITIVE INFORMATICS, VOLS 1 AND 2, 2006, : 716 - 721
  • [10] Neural networks with adaptive spline activation function
    Campolucci, P
    Capparelli, F
    Guarnieri, S
    Piazza, F
    Uncini, A
    [J]. MELECON '96 - 8TH MEDITERRANEAN ELECTROTECHNICAL CONFERENCE, PROCEEDINGS, VOLS I-III: INDUSTRIAL APPLICATIONS IN POWER SYSTEMS, COMPUTER SCIENCE AND TELECOMMUNICATIONS, 1996, : 1442 - 1445