Convolutional neural networks based on fractional-order momentum for parameter training

Cited by: 10
Authors
Kan, Tao [1 ]
Gao, Zhe [1 ,2 ]
Yang, Chuang [1 ]
Jian, Jing [1 ]
Affiliations
[1] Liaoning Univ, Sch Math, Shenyang 110036, Peoples R China
[2] Liaoning Univ, Coll Light Ind, Shenyang 110036, Peoples R China
Keywords
Convolutional neural networks; Fractional-order difference; Momentum; MNIST; CIFAR-10; RECOGNITION; STABILITY; DISCRETE; TERM
DOI
10.1016/j.neucom.2021.03.075
CLC classification
TP18 [Artificial intelligence theory]
Subject classification codes
081104; 0812; 0835; 1405
Abstract
This paper proposes a parameter training method for convolutional neural networks (CNNs) based on fractional-order momentum. To update CNN parameters more smoothly, the training method is built on the Grünwald-Letnikov (G-L) difference operation: the stochastic classical momentum (SCM) algorithm and the adaptive moment estimation (Adam) algorithm are improved by replacing the integer-order difference with a fractional-order difference. Linear and nonlinear schemes for adjusting the fractional order are also discussed, which improves the flexibility and adaptive ability of CNN parameter training. We validate the methods on the MNIST and CIFAR-10 datasets, and the experimental results show that, compared with the traditional SCM and Adam methods, the proposed methods improve both the recognition accuracy and the convergence speed of CNN learning. (c) 2021 Elsevier B.V. All rights reserved.
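The core idea the abstract describes, replacing the integer-order difference in a momentum update with a G-L fractional-order difference, can be sketched as follows. This is a minimal illustration, not the paper's exact update rule: the class name, the finite `memory` truncation, and the way the momentum coefficient `mu` is combined with the G-L weights are all assumptions made here for clarity; the paper's precise SCM/Adam variants and its fractional-order adjustment schemes are in the full text.

```python
import numpy as np

def gl_coefficients(alpha, n):
    """First n Grünwald-Letnikov weights c_j = (-1)^j * binom(alpha, j),
    computed via the recurrence c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)."""
    c = np.empty(n)
    c[0] = 1.0
    for j in range(1, n):
        c[j] = c[j - 1] * (1.0 - (alpha + 1.0) / j)
    return c

class FractionalMomentumSGD:
    """Illustrative SGD variant whose momentum term is a truncated G-L
    fractional-order combination of past updates (a sketch under the
    assumptions stated above, not the paper's exact method)."""

    def __init__(self, lr=0.01, mu=0.9, alpha=0.9, memory=5):
        self.lr = lr                              # learning rate
        self.mu = mu                              # momentum coefficient
        self.c = gl_coefficients(alpha, memory)   # G-L weights c_0..c_{m-1}
        self.history = []                         # past updates v_{t-1}, v_{t-2}, ...

    def step(self, w, grad):
        # Fractional momentum: G-L-weighted sum of stored past updates.
        # With alpha = 1 only -c_1 = 1 survives, recovering classical
        # momentum v_t = mu * v_{t-1} - lr * grad.
        frac = sum(-cj * vj for cj, vj in zip(self.c[1:], self.history))
        v = self.mu * frac - self.lr * grad
        self.history.insert(0, v)
        self.history = self.history[: len(self.c) - 1]
        return w + v
```

For `alpha = 1` the G-L weights are `(1, -1, 0, 0, ...)`, so the update collapses to standard first-order momentum; this is the sense in which the fractional order generalizes the integer-order difference, letting older updates contribute with slowly decaying weights when `0 < alpha < 1`.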
Pages: 85-99 (15 pages)