COMPUTATIONAL STUDIES WITH EQUIVALENT DEGREES OF FREEDOM IN NEURAL NETWORKS

Cited by: 0
Authors
Ingrassia, Salvatore [1 ]
Morlini, Isabella [2 ]
Institutions
[1] Univ Catania, Dipartimento Econ Metodi Quantitativi, Corso Italia 55, I-95128 Catania, Italy
[2] Univ Modena & Reggio Emilia, Dipartimento Sci Social Cognit Quantitat, I-42100 Reggio Emilia, Italy
Keywords
neural models; nested test; confidence intervals; equivalent degrees of freedom; Monte Carlo study;
DOI
Not available
Chinese Library Classification (CLC)
O21 [Probability Theory and Mathematical Statistics]; C8 [Statistics];
Subject classification codes
020208; 070103; 0714;
Abstract
The notion of equivalent number of degrees of freedom (e.d.f.) has recently been proposed in the context of neural network modeling for small data sets. This quantity is much smaller than the number of parameters in the network, and it does not depend on the number of input variables. In this paper, we present numerical studies on both real and simulated data sets supporting the validity of the e.d.f. in a general framework. Results confirm that the e.d.f. performs more reliably than the total number W of adaptive parameters (which is usually taken as the degrees of freedom of the model in common statistical software) for analyzing and comparing neural models. The numerical studies also show that the e.d.f. works well for estimating the error variance and constructing approximate confidence intervals. We then compare several model selection criteria, and the results show that for neural networks GCV performs slightly better. Finally, we present a simple forward procedure, easily implemented, for automatically selecting a neural model with a good trade-off between learning error and generalization properties.
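The abstract's comparison of GCV against the parameter count can be illustrated with the standard GCV formula, using the e.d.f. as the effective model complexity. A minimal sketch, assuming the usual definition GCV = (RSS/n) / (1 - edf/n)^2; the sample values and e.d.f. numbers below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def gcv(residuals, edf):
    """Generalized Cross-Validation score, with the equivalent
    degrees of freedom (e.d.f.) used as the model complexity
    instead of the raw number W of adaptive parameters.

    GCV = (RSS / n) / (1 - edf / n)**2
    """
    residuals = np.asarray(residuals, dtype=float)
    n = residuals.size
    rss = float(np.sum(residuals ** 2))
    return (rss / n) / (1.0 - edf / n) ** 2

# Toy example with a small sample (n = 30): penalizing by a large
# parameter count close to n inflates the score far more than
# penalizing by a much smaller e.d.f. value.
rng = np.random.default_rng(0)
res = rng.normal(scale=0.5, size=30)
print(gcv(res, edf=4.0))   # modest complexity penalty (e.d.f.-like)
print(gcv(res, edf=25.0))  # severe penalty when complexity nears n
```

This illustrates why, for small data sets, substituting the e.d.f. for W in a criterion like GCV matters: with W close to the sample size, the denominator (1 - W/n)^2 collapses toward zero and the criterion becomes unusable.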
Pages: 49-81 (33 pages)
Related papers
50 items in total
  • [21] Fuzzy logic systems are equivalent to feedforward neural networks
    Hongxing Li
    [J]. Science in China Series E: Technological Sciences, 2000, 43 : 42 - 54
  • [22] Fuzzy logic systems are equivalent to feedforward neural networks
    Li Hongxing
    [J]. Science China Technological Sciences, 2000, (01) : 42 - 54
  • [23] Fuzzy logic systems are equivalent to feedforward neural networks
    Li, HX
    [J]. SCIENCE IN CHINA SERIES E-TECHNOLOGICAL SCIENCES, 2000, 43 (01): : 42 - 54
  • [24] Computational neurogenetic modelling: Gene networks within neural networks
    Kasabov, N
    Benuskova, L
    Wysoski, SG
    [J]. 2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 1203 - 1208
  • [25] Spiking Neural Networks in Spintronic Computational RAM
    Cilasun, Husrev
    Resch, Salonik
    Chowdhury, Zamshed I.
    Olson, Erin
    Zabihi, Masoud
    Zhao, Zhengyang
    Peterson, Thomas
    Parhi, Keshab K.
    Wang, Jian-Ping
    Sapatnekar, Sachin S.
    Karpuzcu, Ulya R.
    [J]. ACM TRANSACTIONS ON ARCHITECTURE AND CODE OPTIMIZATION, 2021, 18 (04)
  • [26] Design of neural networks for solving computational problems
    ElBakry, HM
    AboElsoud, MA
    Soliman, HH
    ElMikati, HA
    [J]. THIRTEENTH NATIONAL RADIO SCIENCE CONFERENCE - NRSC'96, 1996, : 281 - 288
  • [27] Spiking Neural Networks for Computational Intelligence: An Overview
    Dora, Shirin
    Kasabov, Nikola
    [J]. BIG DATA AND COGNITIVE COMPUTING, 2021, 5 (04)
  • [28] ON THE COMPUTATIONAL-EFFICIENCY OF SYMMETRICAL NEURAL NETWORKS
    WIEDERMANN, J
    [J]. THEORETICAL COMPUTER SCIENCE, 1991, 80 (02) : 337 - 345
  • [29] The Computational Power of Interactive Recurrent Neural Networks
    Cabessa, Jeremie
    Siegelmann, Hava T.
    [J]. NEURAL COMPUTATION, 2012, 24 (04) : 996 - 1019
  • [30] Computational inference of neural information flow networks
    Smith, V. Anne
    Yu, Jing
    Smulders, Tom V.
    Hartemink, Alexander J.
    Jarvis, Erich D.
    [J]. PLOS COMPUTATIONAL BIOLOGY, 2006, 2 (11) : 1436 - 1449