Effects of the polarity of neural network units on backpropagation learning

Cited by: 1
Authors
Gotanda, H
Ueda, Y
Kawasaki, T
Affiliations
[1] Faculty of Engineering, Kinki University in Kyushu, Iizuka
[2] University of California, Berkeley, CA
[3] IEEE; Society of Instrument and Control Engineers (SICE)
Keywords
multilayered network; error backpropagation; polarity of unit; convergence
DOI
10.1002/scj.4690271407
Chinese Library Classification (CLC)
TP3 [computing technology; computer technology]
Discipline classification code
0812
Abstract
This paper considers neural networks in which, as is usual, the initial weights and biases are set to random numbers. The results of backpropagation (BP) learning are compared for networks composed of unipolar units, whose activation ranges from 0 to 1, and networks composed of bipolar units, whose activation ranges from -0.5 to 0.5. When the input space is large, the separating hyperplane at the start of learning passes near the center of the input space in the bipolar case, whereas in the unipolar case it passes near a vertex. Because of this property, the number of separating hyperplanes that effectively partition the input spaces of the layers while the solution is being updated or realized is larger in the bipolar case than in the unipolar case, and the difference becomes more pronounced as the network size increases. Simulations verify that, for large networks, learning with the bipolar network converges over a wider range of initial values than learning with the unipolar network. It is also shown that the kinds of solution obtained by the unipolar network tend to be biased.
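The hyperplane argument in the abstract can be checked numerically. The sketch below is not from the paper; it assumes a single unit with weights and a bias drawn uniformly from [-0.5, 0.5] (the helper mean_distance_to_centre is a hypothetical name) and measures the average distance from the initial decision boundary w·x + b = 0 to the center of the input hypercube, for unipolar inputs in [0, 1]^n and bipolar inputs in [-0.5, 0.5]^n.

```python
# A minimal numerical sketch (not from the paper) of the hyperplane argument.
# Assumption: a single unit with weights w and bias b drawn uniformly from
# [-0.5, 0.5]; its decision boundary is w.x + b = 0.  We measure the average
# distance |w.c + b| / ||w|| from that boundary to the centre c of the input
# hypercube: c = (0.5, ..., 0.5) for unipolar inputs in [0, 1]^n, and c = 0
# for bipolar inputs in [-0.5, 0.5]^n.
import numpy as np

rng = np.random.default_rng(0)

def mean_distance_to_centre(n_inputs, centre, trials=2000, init_range=0.5):
    """Average distance from the random initial hyperplane to `centre`."""
    total = 0.0
    for _ in range(trials):
        w = rng.uniform(-init_range, init_range, size=n_inputs)
        b = rng.uniform(-init_range, init_range)
        total += abs(w @ centre + b) / np.linalg.norm(w)
    return total / trials

for n in (2, 10, 50, 200):
    uni = mean_distance_to_centre(n, np.full(n, 0.5))  # unipolar coding
    bi = mean_distance_to_centre(n, np.zeros(n))       # bipolar coding
    print(f"n={n:4d}  unipolar: {uni:.3f}  bipolar: {bi:.3f}")
```

Under this initialization one should see the bipolar distance fall roughly as 1/sqrt(n), while the unipolar distance stays on the order of the cube's half-width (0.5), consistent with the boundary cutting through the center of a bipolar input space but passing near a vertex of a unipolar one.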
Pages: 55-67
Number of pages: 13