Speed up Training of the Recurrent Neural Network Based on Constrained Optimization Techniques

Cited by: 1
Authors:
Chen Ke
Bao Weiquan
Chi Huisheng
Institution:
Peking University, Beijing, China
Keywords:
Recurrent neural network; adaptive learning rate; gradient-based algorithm
DOI:
Not available
Chinese Library Classification:
TP393 [Computer Networks]
Subject Classification Codes:
081201; 1201
Abstract
In this paper, the constrained optimization technique is explored for a substantial problem, namely accelerating the training of the globally recurrent neural network. Unlike most of the previous methods developed for feedforward neural networks, the authors adopt the constrained optimization technique to improve the gradient-based algorithm of the globally recurrent neural network, yielding an adaptive learning rate during training. Using the recurrent network with the improved algorithm, experiments on two real-world problems, namely filtering additive noise in acoustic data and classification of temporal signals for speaker identification, have been performed. The experimental results show that the recurrent neural network with the improved learning algorithm trains significantly faster and achieves satisfactory performance.
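The abstract centers on deriving an adaptive learning rate for gradient-based recurrent-network training from a constrained optimization step. The following is a minimal illustrative sketch of that general idea, not the paper's actual formulation: a small Elman-style recurrent network is trained by backpropagation through time on a toy noise-filtering task, and at each epoch the learning rate is set by a constraint on the norm of the weight update. The network sizes, the step bound `delta`, and the cap `eta_max` are all assumptions made for this example.

```python
# Illustrative sketch only: adaptive learning rate from a constraint on the
# weight-update norm, applied to a small recurrent network trained with BPTT.
import numpy as np

rng = np.random.default_rng(0)

# Toy task resembling noise filtering: predict the clean signal from a noisy one.
T = 200
clean = np.sin(np.linspace(0, 8 * np.pi, T))
noisy = clean + 0.3 * rng.standard_normal(T)

n_in, n_hid, n_out = 1, 8, 1
W_in = rng.standard_normal((n_hid, n_in)) * 0.1    # input -> hidden
W_rec = rng.standard_normal((n_hid, n_hid)) * 0.1  # hidden -> hidden (recurrent)
W_out = rng.standard_normal((n_out, n_hid)) * 0.1  # hidden -> output

def forward(x_seq):
    """Run the recurrent net over a sequence, keeping states for BPTT."""
    h = np.zeros(n_hid)
    hs, ys = [h], []
    for x in x_seq:
        h = np.tanh(W_in @ np.array([x]) + W_rec @ h)
        hs.append(h)
        ys.append((W_out @ h)[0])
    return np.array(ys), hs

def gradients(x_seq, t_seq):
    """Full backpropagation through time for the summed squared error."""
    ys, hs = forward(x_seq)
    gW_in, gW_rec, gW_out = np.zeros_like(W_in), np.zeros_like(W_rec), np.zeros_like(W_out)
    dh_next = np.zeros(n_hid)
    for t in reversed(range(len(x_seq))):
        dy = ys[t] - t_seq[t]                     # dE/dy_t
        gW_out += np.outer([dy], hs[t + 1])
        dh = W_out.T @ np.array([dy]) + dh_next   # error reaching h_t
        dz = (1.0 - hs[t + 1] ** 2) * dh          # back through tanh
        gW_in += np.outer(dz, [x_seq[t]])
        gW_rec += np.outer(dz, hs[t])
        dh_next = W_rec.T @ dz                    # propagate to the previous step
    err = 0.5 * np.sum((ys - t_seq) ** 2)
    return err, gW_in, gW_rec, gW_out

eta_max, delta = 0.1, 0.5   # assumed hyper-parameters for this sketch
for epoch in range(200):
    err, gW_in, gW_rec, gW_out = gradients(noisy[:-1], clean[1:])
    g_norm = np.sqrt(np.sum(gW_in**2) + np.sum(gW_rec**2) + np.sum(gW_out**2))
    # Constrained step: require ||eta * g|| <= delta, which gives an adaptive eta.
    eta = min(eta_max, delta / (g_norm + 1e-12))
    W_in -= eta * gW_in
    W_rec -= eta * gW_rec
    W_out -= eta * gW_out
    if epoch % 50 == 0:
        print(f"epoch {epoch:3d}  error {err:.4f}  eta {eta:.4f}")
```

With the step-size constraint active, large gradients early in training are automatically damped while small gradients still use the full eta_max; the constrained formulation actually used in the paper may differ from this sketch.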
Pages: 581-588
Page count: 8
Related Papers
50 records
  • [1] Speed up training of the recurrent neural network based on constrained optimization techniques
    Peking Univ, Beijing, China
J Comput Sci Technol, 6 (581-588)
  • [2] Recurrent neural network training with the Kalman Filter-based techniques
    Trebaticky, P
    NEURAL NETWORK WORLD, 2005, 15 (05) : 471 - 488
  • [3] Using Supercomputer to Speed Up Neural Network Training
    Yu, Yue
    Chi, Xuebin
    Jiang, Jinrong
2016 IEEE 22ND INTERNATIONAL CONFERENCE ON PARALLEL AND DISTRIBUTED SYSTEMS (ICPADS), 2016: 942 - 947
  • [4] A one-layer recurrent neural network for constrained nonconvex optimization
    Li, Guocheng
    Yan, Zheng
    Wang, Jun
    NEURAL NETWORKS, 2015, 61 : 10 - 21
  • [5] A Recurrent Neural Network Approach for Constrained Distributed Fuzzy Convex Optimization
    Liu, Jingxin
    Liao, Xiaofeng
    Dong, Jin-Song
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (07) : 9743 - 9757
  • [6] A One-Layer Recurrent Neural Network for Constrained Nonsmooth Optimization
    Liu, Qingshan
    Wang, Jun
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 2011, 41 (05): 1323 - 1333
  • [7] A constrained optimization algorithm for training locally recurrent globally feedforward neural networks
    Mastorocostas, PA
Proceedings of the International Joint Conference on Neural Networks (IJCNN), Vols 1-5, 2005: 717 - 722
  • [8] A constrained optimization method based on BP neural network
    Zhang, Li
    Wang, Fulin
    Sun, Ting
    Xu, Bing
NEURAL COMPUTING & APPLICATIONS, 2018, 29 (02): 413 - 421
  • [9] A constrained optimization method based on BP neural network
    Li Zhang
    Fulin Wang
    Ting Sun
    Bing Xu
    Neural Computing and Applications, 2018, 29 : 413 - 421
  • [10] A novel neural network training technique based on a multi-algorithm constrained optimization strategy
    Karras, DA
    Lagaris, IE
24TH EUROMICRO CONFERENCE - PROCEEDINGS, VOLS 1 AND 2, 1998: 683 - 687