A neural network incorporating coupled gradient descent method

Cited: 0
Authors
Isokawa, T [1 ]
Matsui, N [1 ]
Peper, F [1 ]
Affiliation
[1] Himeji Inst Technol, Dept Comp Engn, Himeji, Hyogo 6712201, Japan
Keywords
DOI
Not available
Chinese Library Classification (CLC)
TP [automation technology, computer technology]
Subject Classification Code
0812
Abstract
The Back Propagation (BP) algorithm for training layered neural networks is widely used in many areas of application. However, BP has several important drawbacks. One of them is "standstill in learning", in which the state of the network becomes trapped in one of the local minima of the error surface. In this work, we present the Coupled Gradient Descent Method (CGDM) as an efficient way of avoiding local minima in the network state.
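The abstract does not spell out the CGDM update rule, so the following is only a minimal Python sketch of the problem it targets: plain (uncoupled) gradient descent on a hypothetical one-dimensional, non-convex error surface, where the final weight depends on the starting point and the descent can stall in the shallower of two minima. The functions `error`, `grad`, and `gradient_descent` and all constants are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch only -- not the authors' CGDM. Plain gradient descent
# on a toy non-convex error surface with two minima: a deeper one near
# w = -1.0 and a shallower one near w = +1.0. Depending on the initial
# weight, the update stalls in the shallow local minimum, which is the
# "standstill in learning" the abstract describes.

def error(w):
    # Hypothetical one-dimensional error surface (an assumption for illustration).
    return w**4 - 2.0 * w**2 + 0.3 * w

def grad(w):
    # Analytic derivative of the toy error surface above.
    return 4.0 * w**3 - 4.0 * w + 0.3

def gradient_descent(w0, lr=0.01, steps=2000):
    # Standard (uncoupled) gradient descent: w <- w - lr * dE/dw.
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

if __name__ == "__main__":
    for w0 in (-2.0, 2.0):
        w = gradient_descent(w0)
        print(f"start w0={w0:+.1f}  ->  final w={w:+.4f}, error={error(w):+.4f}")
```

Started at w0 = -2.0, the descent reaches the deeper minimum (error about -1.31); started at w0 = +2.0, it stalls in the shallower local minimum (error about -0.71). According to the abstract, CGDM is intended to avoid exactly this kind of trap; its actual coupling scheme is described in the full paper, not here.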
Pages: 162-165
Number of pages: 4