Regularization theory in the study of generalization ability of a biological neural network model

Cited by: 0
Author
Aleksandra Świetlicka
Affiliation
[1] Poznan University of Technology, Institute of Automation and Robotics
Keywords
Kinetic model of neuron; Markov kinetic schemes; Lagrange multipliers; Generalization ability; Image processing; Noise reduction; 68T05
DOI
Not available
Abstract
This paper focuses on the generalization ability of a dendritic neuron model (a model of a simple neural network). The considered model is an extension of the Hodgkin-Huxley model: Markov kinetic schemes are used in its mathematical description, while the Lagrange multiplier method is applied to train it. The generalization ability of the model is studied with a method known from regularization theory, in which a regularizer is added to the neural network error function. Three regularizers are applied in the study: the sum of squared weights of the model (the penalty function), a linear differential operator related to the input-output mapping (the Tikhonov functional), and the square norm of the network curvature. The influence of these regularizers on the training process and its results is illustrated on the problem of noise reduction in images of electronic components, and several metrics are used to compare the results obtained with different regularizers.
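The abstract names three regularizer terms added to the network error function. The sketch below, in plain NumPy, illustrates how each term could be computed; it uses a hypothetical one-input feedforward surrogate in place of the paper's kinetic dendritic model (which, like the Lagrange-multiplier training, is not reproduced here), and all variable names, the toy targets, and the value of the regularization coefficient are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical surrogate mapping y(x) = W2 . tanh(W1 * x + b1) + b2,
# standing in for the dendritic neuron model of the paper.
W1 = rng.normal(size=8)
b1 = np.zeros(8)
W2 = rng.normal(size=8)
b2 = 0.0

def y(x):
    return W2 @ np.tanh(W1 * x + b1) + b2

def dy(x, h=1e-3):
    # central finite difference: first derivative of the input-output mapping
    return (y(x + h) - y(x - h)) / (2.0 * h)

def d2y(x, h=1e-3):
    # central finite difference: second derivative (curvature of the mapping)
    return (y(x + h) - 2.0 * y(x) + y(x - h)) / h**2

xs = np.linspace(-1.0, 1.0, 64)
ts = np.sin(np.pi * xs)  # placeholder training targets
err = 0.5 * np.mean([(y(x) - t) ** 2 for x, t in zip(xs, ts)])

# (1) penalty function: sum of squared weights (weight decay)
omega_penalty = np.sum(W1**2) + np.sum(b1**2) + np.sum(W2**2) + b2**2
# (2) Tikhonov functional: squared norm of a linear differential operator
#     applied to the mapping; here, its first derivative
omega_tikhonov = np.mean([dy(x) ** 2 for x in xs])
# (3) square norm of the network curvature (second derivative)
omega_curvature = np.mean([d2y(x) ** 2 for x in xs])

lam = 1e-2  # regularization strength (assumed value)
total_error = err + lam * omega_penalty  # swap in omega_tikhonov / omega_curvature
```

Minimizing `total_error` instead of `err` trades data fit against smoothness of the learned mapping, which is the mechanism by which each regularizer shapes generalization.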
Pages: 1793-1805
Page count: 12