Regularization theory in the study of generalization ability of a biological neural network model

Cited: 0
Author
Aleksandra Świetlicka
Affiliation
Poznan University of Technology, Institute of Automation and Robotics
Source
Advances in Computational Mathematics, 2019, 45(4)
Keywords
Kinetic model of neuron; Markov kinetic schemes; Lagrange multipliers; Generalization ability; Image processing; Noise reduction; 68T05 (MSC)
DOI
Not available
Abstract
This paper focuses on the generalization ability of a dendritic neuron model (a model of a simple neural network). The considered model is an extension of the Hodgkin-Huxley model. Markov kinetic schemes are used in its mathematical description, while the Lagrange multipliers method is used to train it. The generalization ability of the model is studied using a method known from regularization theory, in which a regularizer is added to the neural network error function. Three regularizers are considered: the sum of squared weights of the model (the penalty function), a linear differential operator related to the input-output mapping (the Tikhonov functional), and the squared norm of the network curvature. The influence of the regularizers on the training process and its results is illustrated with the problem of noise reduction in images of electronic components. Several metrics are used to compare the results obtained for the different regularizers.
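For readers unfamiliar with the three regularizers named in the abstract, a minimal sketch in generic notation is given below. The symbols E, F(x; w), lambda, and the operator D are illustrative choices for this sketch, not the notation used in the paper.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Generic regularized error function: E(w) is the network training error,
% F(x; w) the input-output mapping realised by the network with weights w,
% lambda >= 0 a regularization coefficient, and D a linear differential
% operator. All symbols are illustrative, not taken from the paper.
\begin{align}
  E_{\mathrm{reg}}(\mathbf{w}) &= E(\mathbf{w}) + \lambda\,\Omega(\mathbf{w}),\\
  % penalty function: sum of squared weights (weight decay)
  \Omega_{\mathrm{penalty}}(\mathbf{w}) &= \sum_{i} w_i^{2},\\
  % Tikhonov functional: linear differential operator applied to the mapping
  \Omega_{\mathrm{Tikhonov}}(\mathbf{w}) &= \int \bigl\lVert (\mathcal{D}F)(x;\mathbf{w}) \bigr\rVert^{2}\,dx,\\
  % squared norm of the network curvature (second derivative of the mapping)
  \Omega_{\mathrm{curvature}}(\mathbf{w}) &= \int \Bigl\lVert \frac{\partial^{2} F(x;\mathbf{w})}{\partial x^{2}} \Bigr\rVert^{2}\,dx.
\end{align}
\end{document}
```

Training with any of these amounts to minimizing E_reg instead of E; a larger lambda trades training-set fit for a smoother input-output mapping and, typically, better generalization.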
Pages: 1793-1805
Number of pages: 12
Related papers (50 records in total)
  • [1] Regularization theory in the study of generalization ability of a biological neural network model
    Swietlicka, Aleksandra
    ADVANCES IN COMPUTATIONAL MATHEMATICS, 2019, 45 (04) : 1793 - 1805
  • [2] PROJECTED WEIGHT REGULARIZATION TO IMPROVE NEURAL NETWORK GENERALIZATION
    Zhang, Guoqiang
    Niwa, Kenta
    Kleijn, W. Bastiaan
    2020 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, 2020, : 4242 - 4246
  • [3] ON THE GENERALIZATION ABILITY OF NEURAL-NETWORK CLASSIFIERS
    MUSAVI, MT
    CHAN, KH
    HUMMELS, DM
    KALANTRI, K
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 1994, 16 (06) : 659 - 663
  • [4] DropAll: Generalization of Two Convolutional Neural Network Regularization Methods
    Frazao, Xavier
    Alexandre, Luis A.
    IMAGE ANALYSIS AND RECOGNITION, ICIAR 2014, PT I, 2014, 8814 : 282 - 289
  • [5] A new regularization learning method for improving generalization capability of neural network
    Wu, Y
    Zhang, LM
    PROCEEDINGS OF THE 4TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-4, 2002, : 2011 - 2015
  • [6] Study and application of a class of neural networks model with better generalization ability
    Wang, YC
    Wu, HX
    Geng, CF
    PROCEEDINGS OF THE 4TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-4, 2002, : 2016 - 2020
  • [7] Research on Improve the Generalization Ability of Neural Network Ensemble System
    Ma, Ning
    Liao, Huihui
    Li, Bin
    SENSORS, MEASUREMENT AND INTELLIGENT MATERIALS, PTS 1-4, 2013, 303-306 : 1444 - +
  • [8] Establishing quantitative structure tribo-ability relationship model using Bayesian regularization neural network
    Gao, Xinlei
    Dai, Kang
    Wang, Zhan
    Wang, Tingting
    He, Junbo
    FRICTION, 2016, 4 (02) : 105 - 115
  • [9] Bayesian model comparison versus generalization ability of neural networks
    Gomari, M
    Järvi, T
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL I AND II, 1999, : 537 - 541