Discovery of Optimal Neurons and Hidden Layers in Feed-Forward Neural Network

Cited: 0
Authors
Thomas, Likewin [1 ]
Kumar, Manoj M., V [1 ]
Annappa, B. [1 ]
Affiliations
[1] NITK, Dept CSE, Mangaluru, Karnataka, India
Keywords
Self-organizing neural network; Cognitron; Feed-forward neural network; Neurons; Hidden layers; ADALINE; Gradient descent; Master-slave model
DOI
Not available
Chinese Library Classification
TP39 [Computer applications]
Discipline Classification Code
081203; 0835
Abstract
Identifying the number of hidden layers, and the number of neurons in each hidden layer, of a multilayered Artificial Neural Network (ANN) for a given input dataset is a challenging problem. A new hypothesis is proposed for organizing the synapse from neuron x to neuron y: the number of neurons whose synapses should fire between hidden layers is identified. With this hypothesis, an effective number of neurons in a multilayered ANN can be determined, and a self-organizing neural network model, referred to as a cognitron, is developed. The conventional brain model is a three-layered perceptron, whereas the proposed model selects the number of layers that is optimal for building an effective model. Our results show that the proposed model constructs a neural network directly by identifying the optimal weights of each neuron and the number of neurons in each dynamically identified hidden layer. The optimized model self-organizes with a different number of neurons in each hidden layer by comparing performance in terms of computational time and error at each iteration. An efficient number of neurons is organized using gradient descent. The proposed model can thus train large models for classification tasks by inserting the optimal number of layers and neurons.
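The abstract's growth strategy — train with gradient descent, compare the error achieved, and keep inserting neurons while the improvement is worthwhile — can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it grows a single hidden layer only, and the `tol` stopping threshold, the learning-rate/epoch settings, and all function names are assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def with_bias(A):
    """Append a constant bias column of ones."""
    return np.hstack([A, np.ones((len(A), 1))])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_mlp(X, y, n_hidden, epochs=2000, lr=1.0):
    """Train a one-hidden-layer sigmoid network by batch gradient descent.

    Returns the final mean squared error and the learned weights.
    """
    W1 = rng.normal(scale=0.5, size=(X.shape[1] + 1, n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden + 1, 1))
    Xb = with_bias(X)
    for _ in range(epochs):
        H = sigmoid(Xb @ W1)            # hidden-layer activations
        Hb = with_bias(H)
        out = sigmoid(Hb @ W2)          # network output
        err = out - y
        # Backpropagate, then take one gradient-descent step per weight matrix.
        d_out = err * out * (1.0 - out)
        d_hid = (d_out @ W2[:-1].T) * H * (1.0 - H)   # skip the bias row of W2
        W2 -= lr * Hb.T @ d_out / len(X)
        W1 -= lr * Xb.T @ d_hid / len(X)
    mse = float(np.mean((sigmoid(with_bias(sigmoid(Xb @ W1)) @ W2) - y) ** 2))
    return mse, (W1, W2)

def grow_hidden(X, y, max_hidden=16, tol=1e-3):
    """Add hidden neurons one at a time; stop when the error gain drops below tol."""
    best_mse, best_h = np.inf, 1
    for h in range(1, max_hidden + 1):
        mse, _ = train_mlp(X, y, h)
        if best_mse - mse > tol:
            best_mse, best_h = mse, h
        else:
            break                       # too little improvement: stop growing
    return best_h, best_mse

# Toy task: learn the logical AND of two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [0], [0], [1]], dtype=float)
h, mse = grow_hidden(X, y)
```

The same loop could grow whole layers instead of single neurons; the paper's model additionally decides the number of hidden layers dynamically, which this single-layer sketch does not attempt.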
Pages: 286-291 (6 pages)
Related Papers (50 total)
  • [21] Study of Full Interval Feed-forward Neural Network
    Guan Shou-ping
    Liang Rong-ye
    PROCEEDINGS OF THE 28TH CHINESE CONTROL AND DECISION CONFERENCE (2016 CCDC), 2016, : 2652 - 2655
  • [22] A Feed-Forward Neural Network for Solving Stokes Problem
    Baymani, M.
    Effati, S.
    Kerayechian, A.
    ACTA APPLICANDAE MATHEMATICAE, 2011, 116 (01) : 55 - 64
  • [24] A modified hidden weight optimization algorithm for feed-forward neural networks
    Yu, CH
    Manry, MT
    THIRTY-SIXTH ASILOMAR CONFERENCE ON SIGNALS, SYSTEMS & COMPUTERS, CONFERENCE RECORD, VOLS 1 AND 2, 2002, : 1034 - 1038
  • [25] Feed-forward neural networks
    Bebis, George
    Georgiopoulos, Michael
    IEEE Potentials, 1994, 13 (04) : 27 - 31
  • [26] A method for determining the hidden neuron number of the feed-forward neural network based on fuzzy cluster analysis
    Luo, XQ
    Shang, CX
    ISTM/2003: 5TH INTERNATIONAL SYMPOSIUM ON TEST AND MEASUREMENT, VOLS 1-6, CONFERENCE PROCEEDINGS, 2003, : 1461 - 1464
  • [27] Optimal Output Gain Algorithm for Feed-Forward Network Training
    Aswathappa, Babu Hemanth Kumar
    Manry, M. T.
    Rawat, Rohit
    2011 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2011, : 2609 - 2616
  • [28] Feed-Forward Network Training Using Optimal Input Gains
    Malalur, Sanjeev S.
    Manry, Michael, Sr.
    IJCNN: 2009 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2009, : 2326 - 2333
  • [29] NORMALIZED DATA BARRIER AMPLIFIER FOR FEED-FORWARD NEURAL NETWORK
    Fuangkhon, P.
    NEURAL NETWORK WORLD, 2021, 31 (02) : 125 - 157
  • [30] Loss Surface Modality of Feed-Forward Neural Network Architectures
    Bosman, Anna Sergeevna
    Engelbrecht, Andries Petrus
    Helbig, Marcie
    2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,