Regularising neural networks using flexible multivariate activation function

Cited by: 23
Authors
Solazzi, M
Uncini, A
Affiliations
[1] Univ Roma La Sapienza, Dipartimento INFOCOM, I-00184 Rome, Italy
[2] Univ Ancona, Dipartimento Elettron & Automat, I-60131 Ancona, Italy
Keywords
neural networks; spline neural networks; multilayer perceptron; generalised sigmoidal functions; adaptive activation functions; spline; regularisation; generalisation
DOI
10.1016/S0893-6080(03)00189-8
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper presents a new general neural structure based on a flexible nonlinear multivariate activation function that can be viewed within the framework of generalised regularisation networks theory. The proposed architecture is built on multidimensional adaptive cubic-spline activation functions that collect information from the previous network layer in aggregate form. In other words, each activation function is a spline function of a subset of the previous layer's outputs, so the number of network connections (structural complexity) can be very low relative to the problem complexity. A specific learning algorithm, based on the adaptation of the local parameters of the activation function, is derived; this improves the network's generalisation capability and speeds up the convergence of the learning process. Finally, experimental results demonstrating the effectiveness of the proposed architecture are presented. (C) 2003 Elsevier Ltd. All rights reserved.
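To make the mechanism described in the abstract concrete, the following is a minimal illustrative sketch, not the authors' exact formulation: a one-dimensional adaptive activation realised as a cubic spline over uniformly spaced knots, whose control points are trainable local parameters. Here a Catmull-Rom basis and a sigmoid-shaped initialisation are assumed, and the class name SplineActivation, the knot count, and the knot range are hypothetical choices for illustration; in the paper's multivariate setting each such spline acts on an aggregate of a small subset of the previous layer's outputs.

```python
import numpy as np

# Catmull-Rom cubic basis matrix; rows pair with [u^3, u^2, u, 1].
CR = 0.5 * np.array([[-1.,  3., -3.,  1.],
                     [ 2., -5.,  4., -1.],
                     [-1.,  0.,  1.,  0.],
                     [ 0.,  2.,  0.,  0.]])

class SplineActivation:
    """Adaptive 1-D activation: a cubic Catmull-Rom spline whose control
    points (ordinates) are trainable local parameters, initialised so the
    curve starts out close to a sigmoid (an assumed, illustrative choice)."""

    def __init__(self, n_points=21, x_range=4.0):
        self.x = np.linspace(-x_range, x_range, n_points)  # fixed, uniform knots
        self.q = 1.0 / (1.0 + np.exp(-self.x))             # adaptive control points
        self.dx = self.x[1] - self.x[0]

    def forward(self, s):
        """Evaluate the spline at the neuron's aggregate input s."""
        s = np.asarray(s, dtype=float)
        # Locate the active spline span and the local abscissa u in [0, 1).
        t = np.clip((s - self.x[0]) / self.dx, 1.0, len(self.q) - 3 - 1e-9)
        i = t.astype(int)
        u = t - i
        U = np.stack([u**3, u**2, u, np.ones_like(u)], axis=-1)           # (..., 4)
        Q = np.stack([self.q[i - 1], self.q[i], self.q[i + 1], self.q[i + 2]],
                     axis=-1)                                             # (..., 4)
        # Only the four control points around the active span enter the output,
        # so a gradient step on q adapts the nonlinearity locally.
        return np.einsum('...k,kj,...j->...', U, CR, Q)

# Usage sketch: aggregate a small subset of previous-layer outputs, then shape it.
act = SplineActivation()
z = np.array([0.3, -1.2, 0.8])          # outputs of a subset of previous-layer units
y = act.forward(np.array([z.sum()]))    # spline applied to the aggregate input
```

Because each input only touches four neighbouring control points, parameter updates stay local, which is the property the abstract credits for faster convergence and better generalisation.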
Pages: 247-260
Page count: 14
Related papers
50 records in total (entries [21]-[30] shown)
  • [21] Using neural networks to model conditional multivariate densities
    Williams, PM
    NEURAL COMPUTATION, 1996, 8 (04) : 843 - 854
  • [22] Multiple neural-network-based adaptive controller using orthonormal activation function neural networks
    Shukla, D
    Dawson, DM
    Paul, FW
    IEEE TRANSACTIONS ON NEURAL NETWORKS, 1999, 10 (06): 1494 - 1501
  • [23] An adaptive activation function for multilayer feedforward neural networks
    Yu, CC
    Tang, YC
    Liu, BD
    2002 IEEE REGION 10 CONFERENCE ON COMPUTERS, COMMUNICATIONS, CONTROL AND POWER ENGINEERING, VOLS I-III, PROCEEDINGS, 2002, : 645 - 650
  • [24] Artificial Neural Networks Activation Function HDL Coder
    Namin, Ashkan Hosseinzadeh
    Leboeuf, Karl
    Wu, Huapeng
    Ahmadi, Majid
    2009 IEEE INTERNATIONAL CONFERENCE ON ELECTRO/INFORMATION TECHNOLOGY, 2009, : 387 - 390
  • [25] FPGA Realization of Activation Function for Artificial Neural Networks
    Saichand, Venakata
    Nirmala, Devi M.
    Arumugam, S.
    Mohankumar, N.
    ISDA 2008: EIGHTH INTERNATIONAL CONFERENCE ON INTELLIGENT SYSTEMS DESIGN AND APPLICATIONS, VOL 3, PROCEEDINGS, 2008, : 159 - 164
  • [26] Optimizing nonlinear activation function for convolutional neural networks
    Varshney, Munender
    Singh, Pravendra
    SIGNAL IMAGE AND VIDEO PROCESSING, 2021, 15 (06) : 1323 - 1330
  • [27] Differentially Private Neural Networks with Bounded Activation Function
    Jung, Kijung
    Lee, Hyukki
    Chung, Yon Dohn
    IEICE TRANSACTIONS ON INFORMATION AND SYSTEMS, 2021, E104D (06) : 905 - 908
  • [28] Enhancement of neural networks with an alternative activation function tanhLU
    Shen, Shui-Long
    Zhang, Ning
    Zhou, Annan
    Yin, Zhen-Yu
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 199
  • [29] Stochastic Implementation of the Activation Function for Artificial Neural Networks
    Yeo, Injune
    Gi, Sang-gyun
    Lee, Byung-geun
    Chu, Myonglae
    PROCEEDINGS OF 2016 IEEE BIOMEDICAL CIRCUITS AND SYSTEMS CONFERENCE (BIOCAS), 2016, : 440 - 443