Search for the Global Extremum Using the Correlation Indicator for Neural Networks Supervised Learning

Cited by: 2
Authors:
Vershkov, N. [1]
Babenko, M. [1]
Kuchukov, V. [1]
Kuchukova, N. [1]
Affiliations:
[1] North Caucasus Fed Univ, Stavropol 355029, Russia
Keywords:
DOI:
10.1134/S0361768820080265
CLC classification:
TP31 [Computer software];
Subject classification:
081202 ; 0835 ;
Abstract:
摘要
The article discusses the search for a global extremum in the training of artificial neural networks using a correlation indicator. A method based on a mathematical model of an artificial neural network represented as an information transmission system is proposed. Since information transmission systems make wide use of methods that allow effective analysis and recovery of a useful signal against a background of various kinds of interference (Gaussian, concentrated, pulsed, etc.), it is reasonable to assume that a mathematical model of an artificial neural network represented as an information transmission system will be effective. The article analyzes the convergence of the training and experimentally obtained sequences based on a correlation indicator for a fully connected neural network. The possibility of estimating the convergence of the training and experimentally obtained sequences based on the joint correlation function, used as a measure of their energy similarity (difference), is confirmed. To evaluate the proposed method, a comparative analysis is made with the indicators currently in use. The potential sources of errors in the least-squares method and the ability of the proposed indicator to overcome them are investigated. Simulation of the learning process of an artificial neural network has shown that using the joint correlation function together with the Adadelta optimizer yields a 2-3 times gain in learning speed compared to CrossEntropyLoss.
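The abstract does not give the exact form of the paper's correlation indicator, but the idea of scoring the network output against the target sequence by their energy similarity can be illustrated with a plain normalized cross-correlation used as a training loss. The function names below are hypothetical and the formula is a common textbook definition, not necessarily the authors' exact indicator:

```python
import math

def correlation_indicator(output, target):
    """Normalized cross-correlation between the network output and the
    target sequence, taken here as a measure of their energy similarity.
    Returns 1.0 when the sequences match up to a positive scale factor,
    and values near 0 when they are uncorrelated."""
    dot = sum(o * t for o, t in zip(output, target))
    energy_o = math.sqrt(sum(o * o for o in output))
    energy_t = math.sqrt(sum(t * t for t in target))
    if energy_o == 0.0 or energy_t == 0.0:
        # Degenerate case: an all-zero sequence carries no energy to compare.
        return 0.0
    return dot / (energy_o * energy_t)

def correlation_loss(output, target):
    """Loss to minimize during training: 1 - correlation, so that a
    perfectly matching output gives a loss of 0."""
    return 1.0 - correlation_indicator(output, target)
```

Such a loss could in principle be minimized with any gradient-based optimizer (the paper reports results with Adadelta); this sketch only shows the indicator itself, not the training loop.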
Pages: 609-618
Page count: 10
Related papers
50 entries in total
  • [1] Search for the Global Extremum Using the Correlation Indicator for Neural Networks Supervised Learning
    N. Vershkov
    M. Babenko
    V. Kuchukov
    N. Kuchukova
    [J]. Programming and Computer Software, 2020, 46 : 609 - 618
  • [2] Procedure neural networks with supervised learning
    Liang, JZ
    Zhou, JQ
    He, XG
    [J]. ICONIP'02: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON NEURAL INFORMATION PROCESSING: COMPUTATIONAL INTELLIGENCE FOR THE E-AGE, 2002, : 523 - 527
  • [3] On the global optimal theory of the outer-supervised learning feedforward neural networks
    Beijing Inst of System Engineering, Beijing, China
    [J]. Tien Tzu Hsueh Pao, (4): 98-101
  • [4] Supervised Learning Probabilistic Neural Networks
    I-Cheng Yeh
    Kuan-Cheng Lin
    [J]. Neural Processing Letters, 2011, 34 : 193 - 208
  • [5] Supervised Learning Probabilistic Neural Networks
    Yeh, I-Cheng
    Lin, Kuan-Cheng
    [J]. NEURAL PROCESSING LETTERS, 2011, 34 (02) : 193 - 208
  • [6] Supervised learning with spiking neural networks
    Xin, JG
    Embrechts, MJ
    [J]. IJCNN'01: INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2001, : 1772 - 1777
  • [7] Supervised learning of process discovery techniques using graph neural networks
    Sommers, Dominique
    Menkovski, Vlado
    Fahland, Dirk
    [J]. INFORMATION SYSTEMS, 2023, 115
  • [8] Supervised Learning in Football Game Environments Using Artificial Neural Networks
    Baykal, Omer
    Alpaslan, Ferda Nur
    [J]. 2018 3RD INTERNATIONAL CONFERENCE ON COMPUTER SCIENCE AND ENGINEERING (UBMK), 2018, : 110 - 115
  • [9] An algorithm of supervised learning for multilayer neural networks
    Tang, Z
    Wang, XG
    Tamura, H
    Ishii, M
    [J]. NEURAL COMPUTATION, 2003, 15 (05) : 1125 - 1142
  • [10] A review of online learning in supervised neural networks
    Lakhmi C. Jain
    Manjeevan Seera
    Chee Peng Lim
    P. Balasubramaniam
    [J]. Neural Computing and Applications, 2014, 25 : 491 - 509