A New Improved Learning Algorithm for Convolutional Neural Networks

Cited: 4
Authors
Yang, Jie [1]
Zhao, Junhong [1]
Lu, Lu [1]
Pan, Tingting [1]
Jubair, Sidra [1]
Affiliations
[1] Dalian Univ Technol, Sch Math Sci, Dalian 116024, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
convolutional neural networks; loss function; MNIST; CIFAR-10
DOI
10.3390/pr8030295
Chinese Library Classification (CLC)
TQ [Chemical Industry]
Discipline Code
0817
Abstract
The back-propagation (BP) algorithm is usually used to train convolutional neural networks (CNNs) and has driven great progress in image classification. It updates weights by gradient descent, so the farther a sample's output is from its target, the more it contributes to the weight change. As a result, the influence of correctly classified samples that lie close to the classification boundary is diminished. This paper defines classification confidence as the degree to which a sample belongs to its correct category, and divides the samples of each category into danger and safe samples according to a dynamic classification-confidence threshold. A new learning algorithm is then presented that penalizes the loss function with the danger samples only, rather than all samples, so that the CNN pays more attention to danger samples and learns effective information more accurately. Experiments on the MNIST dataset and three sub-datasets of CIFAR-10 showed that on MNIST the accuracy of the non-improved CNN reached 99.246%, while that of the proposed PCNN reached 99.3%; on the three CIFAR-10 sub-datasets, the accuracies of the non-improved CNN were 96.15%, 88.93%, and 94.92%, respectively, while those of PCNN were 96.44%, 89.37%, and 95.22%, respectively.
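The penalized loss described in the abstract can be sketched as follows. All specifics here are assumptions for illustration, not the paper's exact formulation: classification confidence is taken as the softmax probability of the true class, the dynamic threshold as the per-class mean confidence within a batch, and the penalty weight `lam` as a free hyperparameter.

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax.
    z = logits - logits.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def penalized_loss(logits, labels, lam=0.5):
    """Cross-entropy with an extra penalty on 'danger' samples.

    Assumed reading of the abstract:
    - confidence = softmax probability assigned to the true class;
    - threshold  = dynamic, here the mean confidence of each class
                   within the current batch;
    - danger     = samples whose confidence falls below their class
                   threshold (i.e. close to the decision boundary);
    - loss       = mean cross-entropy + lam * penalty on danger samples.
    """
    probs = softmax(logits)
    n = len(labels)
    conf = probs[np.arange(n), labels]            # per-sample confidence

    classes = np.unique(labels)                   # sorted unique labels
    thresholds = np.array([conf[labels == c].mean() for c in classes])
    thr = thresholds[np.searchsorted(classes, labels)]
    danger = conf < thr                           # danger-sample mask

    ce = -np.log(conf + 1e-12)                    # per-sample cross-entropy
    loss = ce.mean() + lam * np.where(danger, ce, 0.0).mean()
    return loss, danger
```

In this sketch only the danger samples contribute the extra penalty term, mirroring the paper's idea of penalizing the loss with danger samples rather than all samples.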
Pages: 13