Feed-forward neural networks

Cited by: 346
Authors
Bebis, George [1 ]
Georgiopoulos, Michael [1 ]
Affiliations
[1] Electrical and Computer Engineering Department, University of Central Florida, United States
Source
IEEE Potentials | 1994 / Vol. 13 / Iss. 04
Keywords
Algorithms - Approximation theory - Computational complexity - Computer architecture - Correlation methods - Curve fitting - Errors - Learning systems - Optimization - Polynomials - Sensitivity analysis
DOI
10.1109/45.329294
Abstract
The paper emphasizes the importance of network size for a given application. Network size affects the complexity, learning time, and generalization capability of the network. An illustrative analogy is drawn between neural network learning and curve fitting. Regarding the choice of hidden nodes and hidden layers, it was found that feed-forward networks can approximate virtually any function of interest to any desired degree of accuracy, provided enough hidden units are available. A small network capable of learning the task is preferable to a larger one for both practical and theoretical reasons. The generalization capability of a network can be improved by modifying its connection weights and architecture, specifically through pruning and constructive approaches.
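The curve-fitting analogy in the abstract can be made concrete with a minimal sketch (not taken from the paper): a single-hidden-layer feed-forward network trained by plain gradient descent to fit a 1-D curve. The architecture, hidden-unit count, learning rate, and target function below are all illustrative assumptions.

```python
import numpy as np

# Illustrative sketch: one-hidden-layer feed-forward network fitting
# y = sin(x), showing "learning as curve fitting". All hyperparameters
# (20 hidden units, lr = 0.01, 2000 steps) are arbitrary choices.
rng = np.random.default_rng(0)
X = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)  # inputs
y = np.sin(X)                                       # target curve

H = 20  # enough hidden units to approximate the curve well
W1 = rng.normal(0.0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, (H, 1)); b2 = np.zeros(1)
lr = 0.01

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden-layer activations
    return h, h @ W2 + b2      # network output

_, out0 = forward(X)
mse_before = np.mean((out0 - y) ** 2)

for _ in range(2000):          # batch gradient descent on MSE
    h, out = forward(X)
    err = out - y                         # output-layer error
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)    # backprop through tanh
    gW1 = X.T @ dh / len(X);  gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

_, out1 = forward(X)
mse_after = np.mean((out1 - y) ** 2)
```

Training drives the mean-squared error down, i.e. the network's output curve is pulled toward the target samples, which is exactly the fitting process the abstract's analogy describes; too few hidden units would underfit, while an oversized network risks the generalization problems the paper discusses.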
Pages: 27-31
Related papers (50 records)
  • [41] Classification of urinary calculi using feed-forward neural networks
    Kuzmanovski, I
    Zdravkova, K
    Trpkovska, M
    SOUTH AFRICAN JOURNAL OF CHEMISTRY-SUID-AFRIKAANSE TYDSKRIF VIR CHEMIE, 2006, 59 : 12 - 16
  • [42] Dynamic hysteresis modelling using feed-forward neural networks
    Makaveev, D
    Dupré, L
    De Wulf, M
    Melkebeek, J
    JOURNAL OF MAGNETISM AND MAGNETIC MATERIALS, 2003, 254 : 256 - 258
  • [43] Intelligent process modelling using Feed-Forward Neural Networks
    Gadallah M.H.
    Hamid El-Sayed K.A.
    Hekman K.
    International Journal of Manufacturing Technology and Management, 2010, 19 (3-4) : 238 - 257
  • [44] Training Algorithm with Incomplete Data for Feed-Forward Neural Networks
    Song-Yee Yoon
    Soo-Young Lee
    Neural Processing Letters, 1999, 10 : 171 - 179
  • [45] Universal approximation of fully complex feed-forward neural networks
    Kim, T
    Adali, H
    2002 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH, AND SIGNAL PROCESSING, VOLS I-IV, PROCEEDINGS, 2002, : 973 - 976
  • [46] Probabilistic and statistical aspects of feed-forward nonlinear neural networks
    Koshimizu, T
    Tsujitani, M
    ICONIP'98: THE FIFTH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING JOINTLY WITH JNNS'98: THE 1998 ANNUAL CONFERENCE OF THE JAPANESE NEURAL NETWORK SOCIETY - PROCEEDINGS, VOLS 1-3, 1998, : 490 - 493
  • [47] A novel activation function for multilayer feed-forward neural networks
    Njikam, Aboubakar Nasser Samatin
    Zhao, Huan
    APPLIED INTELLIGENCE, 2016, 45 (01) : 75 - 82
  • [48] Differential evolution training algorithm for feed-forward neural networks
    Ilonen, J
    Kamarainen, JK
    Lampinen, J
    NEURAL PROCESSING LETTERS, 2003, 17 (01) : 93 - 105
  • [49] Parallelizable Reachability Analysis Algorithms for Feed-Forward Neural Networks
    Tran, Hoang-Dung
    Musau, Patrick
    Lopez, Diego Manzanas
    Yang, Xiaodong
    Nguyen, Luan Viet
    Xiang, Weiming
    Johnson, Taylor T.
    2019 IEEE/ACM 7TH INTERNATIONAL WORKSHOP ON FORMAL METHODS IN SOFTWARE ENGINEERING (FORMALISE 2019), 2019, : 31 - 40
  • [50] Feed-forward neural networks as a mixed-integer program
    Aftabi, Navid
    Moradi, Nima
    Mahroo, Fatemeh
    ENGINEERING WITH COMPUTERS, 2025,