To improve the training time of BP neural networks

Cited by: 0
Authors:
Yu, CC [1 ]
Tang, YC [1 ]
Affiliation:
[1] Hsiuping Inst Technol, Dept Elect Engn, Taichung 412, Taiwan
Keywords:
neural network; BP algorithm; learning rate; training pairs
DOI:
Not available
Chinese Library Classification (CLC):
TP18 [Theory of artificial intelligence]
Subject classification codes:
081104; 0812; 0835; 1405
Abstract
Reducing the training time of back-propagation (BP) neural networks is one of the most important practical tasks. This paper presents two new methods that shorten training by dynamically adjusting the weights during error back-propagation. Both approaches are obtained through a suitable modification of the traditional, classical method. Results of computer experiments with the modified BP algorithm are provided; they show that the new methods are effective on several test problems and train multi-layer feed-forward neural networks faster than the traditional methods.
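The record does not reproduce the paper's actual weight-adjustment rule, so the following is only a minimal sketch of the general idea the abstract describes: classical batch BP whose learning rate is adjusted dynamically from epoch to epoch. The XOR training pairs, the 2-4-1 architecture, the initial rate of 0.5, and the 1.05/0.7 adjustment factors are all assumptions for illustration, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Training pairs for the XOR benchmark (assumed data set; the paper's
# test problems are not listed in this record).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-4-1 feed-forward network with sigmoid units.
W1 = rng.normal(scale=0.5, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=0.5, size=(4, 1))
b2 = np.zeros(1)

lr = 0.5                 # initial learning rate (assumed value)
prev_error = np.inf

for epoch in range(5000):
    # Forward pass over all training pairs (batch BP).
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    error = 0.5 * np.sum((T - Y) ** 2)

    # Classical back-propagation of the output error (delta rule).
    delta_out = (Y - T) * Y * (1.0 - Y)
    delta_hid = (delta_out @ W2.T) * H * (1.0 - H)

    W2 -= lr * (H.T @ delta_out)
    b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * (X.T @ delta_hid)
    b1 -= lr * delta_hid.sum(axis=0)

    # Dynamic adjustment (assumed heuristic): grow the rate while the
    # epoch error keeps falling, shrink it when the error rises.
    lr = lr * 1.05 if error < prev_error else lr * 0.7
    prev_error = error

print(f"final SSE: {error:.4f}, outputs: {Y.ravel()}")
```

On this toy problem an adaptive rate of this kind typically reaches a small sum-squared error in fewer epochs than a fixed rate, which is the sort of speed-up the abstract claims for the proposed methods.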
Pages: C473 - C479
Number of pages: 7
Related papers (50 records in total)
  • [1] Improve of BP neural networks model
    [J]. Xitong Gongcheng Lilun yu Shijian, 1: 67-71
  • [2] Parallel training algorithm of BP neural networks
    Li, J; Li, YX; Xu, JW; Zhang, JB
    [J]. PROCEEDINGS OF THE 3RD WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-5, 2000: 872-876
  • [3] GENOUD-BP: A novel training algorithm for artificial neural networks
    Yang, Zhiyong; Zhang, Shiyuan; Zhang, Taohong
    [J]. PROCEEDINGS OF THE 5TH INTERNATIONAL CONFERENCE ON INFORMATION ENGINEERING FOR MECHANICS AND MATERIALS, 2015, 21: 908-912
  • [4] An Analysis of Instance Selection for Neural Networks to Improve Training Speed
    Sun, Xunhu; Chan, Philip K.
    [J]. 2014 13TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND APPLICATIONS (ICMLA), 2014: 288-293
  • [5] Social relationship prediction across networks using tri-training BP neural networks
    Liu, Qun; Liu, Shuxin; Wang, Guoyin; Xia, Shuyin
    [J]. NEUROCOMPUTING, 2020, 401: 377-391
  • [6] On the BP Training Algorithm of Fuzzy Neural Networks (FNNs) via Its Equivalent Fully Connected Neural Networks (FFNNs)
    Wang, Jing; Wang, Chi-Hsu; Chen, C. L. Philip
    [J]. 2011 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2011: 1376-1381
  • [7] A TIME ENCODING APPROACH TO TRAINING SPIKING NEURAL NETWORKS
    Adam, Karen
    [J]. 2022 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2022: 5957-5961
  • [8] Online Training Through Time for Spiking Neural Networks
    Xiao, Mingqing; Meng, Qingyan; Zhang, Zongpeng; He, Di; Lin, Zhouchen
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [9] Application of BP neural networks to transition detection in time series models
    Emoto, Takahiro; Akutagawa, Masatake; Kinouchi, Yohsuke; Abeyratne, Udantha R.; Nagashino, Hirofumi
    [J]. WMSCI 2005: 9th World Multi-Conference on Systemics, Cybernetics and Informatics, Vol 6, 2005: 360-365
  • [10] Retrospective Loss: Looking Back to Improve Training of Deep Neural Networks
    Jandial, Surgan; Chopra, Ayush; Sarkar, Mausoom; Gupta, Piyush; Krishnamurthy, Balaji; Balasubramanian, Vineeth
    [J]. KDD '20: PROCEEDINGS OF THE 26TH ACM SIGKDD INTERNATIONAL CONFERENCE ON KNOWLEDGE DISCOVERY & DATA MINING, 2020: 1123-1131