Gradient descent learning of radial-basis neural networks

Cited by: 0
Author: Karayiannis, NB
Institution:
Keywords:
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
This paper presents an axiomatic approach to building RBF neural networks and proposes a supervised learning algorithm based on gradient descent for their training. The approach yields a broad variety of admissible RBF models, including those employing Gaussian radial basis functions. The form of the radial basis functions is determined by a generator function. A sensitivity analysis explains the failure of gradient descent learning on RBF networks with Gaussian radial basis functions, which are generated by an exponential generator function. The same analysis verifies that RBF networks generated by a linear generator function are much better suited to gradient descent learning. Experiments involving such RBF networks indicate that the proposed gradient descent algorithm guarantees fast learning and very satisfactory generalization ability.
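The ideas summarized above (basis functions derived from a generator function, with the network trained by gradient descent) can be sketched in a few lines of NumPy. The snippet below is only an illustration of the general scheme, not the paper's algorithm: the linear generator g(u) = a*u + b, the exponent 1/(1 - m) that turns the generator into a basis function, the choice to adapt only the output weights, and all hyperparameters are assumptions made for readability.

```python
import numpy as np

# Minimal sketch of an RBF network whose basis functions are derived from a
# generator function, trained by plain gradient descent on squared error.
# The generator form and the exponent 1/(1 - m) are assumptions chosen for
# illustration; they are not taken verbatim from the paper.

def linear_generator(u, a=1.0, b=1.0):
    # Linear generator g(u) = a*u + b; per the abstract, networks built from a
    # linear generator are better suited to gradient descent learning.
    return a * u + b

def rbf_response(x, centers, m=3.0):
    # Squared distances from the input x to every prototype (center).
    d2 = np.sum((centers - x) ** 2, axis=1)
    # Basis function phi(u) = g(u)**(1/(1-m)); an exponential generator
    # g(u) = exp(beta*u) would yield Gaussian basis functions instead.
    return linear_generator(d2) ** (1.0 / (1.0 - m))

def train_rbf(X, y, n_hidden=8, lr=0.05, epochs=500, seed=0):
    rng = np.random.default_rng(seed)
    # Initialize prototypes from random training points, output weights near zero.
    centers = X[rng.choice(len(X), n_hidden, replace=False)].astype(float)
    weights = rng.normal(scale=0.1, size=n_hidden)
    for _ in range(epochs):
        for x, t in zip(X, y):
            h = rbf_response(x, centers)   # hidden-layer responses
            err = h @ weights - t          # output error for this sample
            # Gradient descent step on the output weights; a full scheme would
            # typically also adapt the prototypes, omitted here for brevity.
            weights -= lr * err * h
    return centers, weights

if __name__ == "__main__":
    # Toy 1-D regression: approximate sin(x) on [0, 2*pi].
    X = np.linspace(0, 2 * np.pi, 40).reshape(-1, 1)
    y = np.sin(X).ravel()
    centers, weights = train_rbf(X, y)
    preds = np.array([rbf_response(x, centers) @ weights for x in X])
    print("mean squared error:", float(np.mean((preds - y) ** 2)))
```

With an exponential generator g(u) = exp(beta*u) the same construction yields Gaussian basis functions, which, according to the abstract's sensitivity analysis, are harder to train by gradient descent.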
Pages: 1815-1820
Page count: 6
Related papers (50 in total)
  • [31] Learning dynamics of gradient descent optimization in deep neural networks
    Wei WU
    Xiaoyuan JING
    Wencai DU
    Guoliang CHEN
    Science China (Information Sciences), 2021, 64 (05) : 17 - 31
  • [32] Relaxed conditions for radial-basis function networks to be universal approximators
    Liao, Y
    Fang, SC
    Nuttle, HLW
    NEURAL NETWORKS, 2003, 16 (07) : 1019 - 1028
  • [33] Learning algorithms for reformulated radial basis neural networks
    Karayiannis, NB
    IEEE WORLD CONGRESS ON COMPUTATIONAL INTELLIGENCE, 1998: 2230 - 2235
  • [34] Wide-band dynamic modeling of power amplifiers using radial-basis function neural networks
    Isaksson, M
    Wisell, D
    Rönnow, D
    IEEE TRANSACTIONS ON MICROWAVE THEORY AND TECHNIQUES, 2005, 53 (11) : 3422 - 3428
  • [35] Prediction of Medical Examination Results Using Radial-Basis Function Networks
    Jang, Gil-Jin
    Kim, Minho
    Kim, Young-Won
    Choi, Jaehun
    2016 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS-ASIA (ICCE-ASIA), 2016,
  • [36] Gradient Descent for Spiking Neural Networks
    Huh, Dongsung
    Sejnowski, Terrence J.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 31 (NIPS 2018), 2018, 31
  • [37] INVERSION OF NEURAL NETWORKS BY GRADIENT DESCENT
    KINDERMANN, J
    LINDEN, A
    PARALLEL COMPUTING, 1990, 14 (03) : 277 - 286
  • [38] Learning Errors by Radial Basis Function Neural Networks and Regularization Networks
    Neruda, Roman
    Vidnerova, Petra
    INTERNATIONAL JOURNAL OF GRID AND DISTRIBUTED COMPUTING, 2009, 2 (01): : 49 - 57
  • [39] Learning errors by radial basis function neural networks and regularization networks
    Institute of Computer Science, Academy of Sciences of the Czech Republic, Pod vodárenskou věží 2, Prague 8, Czech Republic
    Int. J. Grid Distrib. Comput., 2009, 1 (49-58)
  • [40] Dynamics of on-line gradient descent learning for multilayer neural networks
    Saad, D
    Solla, SA
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 8: PROCEEDINGS OF THE 1995 CONFERENCE, 1996, 8 : 302 - 308