Efficient training of Time Delay Neural Networks for sequential patterns

Cited by: 7
Authors
Cancelliere, R
Gemello, R
Affiliations
[1] CSELT S.p.A., I-10148 Turin, Italy
[2] University of Turin, Dept. of Mathematics, I-10123 Turin, Italy
Keywords
TDNN; sequential patterns; efficient training;
DOI
10.1016/0925-2312(95)00044-5
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Time Delay Neural Networks are an extension of the classical multi-layer perceptron with time-delayed links. They are used for sequence recognition problems in which a finite memory of past events is sufficient. Usually, Time Delay Neural Networks are trained by performing a complete spatial expansion of the delayed links through time, reducing the training to that of a feedforward network. This complete expansion, however, is unnecessary: it is sufficient to combine a partial spatial expansion with a sliding input window to obtain the same result. In this way the computational efficiency of standard backpropagation is retained, while the method gains flexibility in handling variable-length sequences and requires less storage. In this paper a general training algorithm for Time Delay Neural Networks is presented, showing in detail the formal differences with respect to error backpropagation for feedforward networks. Furthermore, an efficient implementation is described which exploits a partial spatial expansion of the delayed links, and its equivalence with the general algorithm is shown formally.
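The sliding-window idea from the abstract can be sketched as follows: a TDNN layer with maximum delay D computes each output frame from the current input and its D predecessors, so the forward pass can slide a window over the sequence instead of unrolling the whole network through time. This is an illustrative sketch only; the function name, the tanh nonlinearity, and the weight layout are assumptions, not taken from the paper.

```python
import numpy as np

def tdnn_layer(x, W, b):
    """One TDNN layer applied via a sliding input window.

    x : (T, n_in)            input sequence of T frames
    W : (D + 1, n_in, n_out) one weight matrix per delay 0..D (assumed layout)
    b : (n_out,)             bias
    Returns an (T - D, n_out) output sequence.
    """
    D = W.shape[0] - 1
    T, n_out = x.shape[0], W.shape[2]
    out = np.empty((T - D, n_out))
    for t in range(D, T):
        # Window of the D+1 most recent frames, newest first:
        # x[t], x[t-1], ..., x[t-D], matching delays 0..D.
        window = x[t - D:t + 1][::-1]                 # shape (D+1, n_in)
        # Sum over delays d and input units i for each output unit o.
        out[t - D] = np.tanh(np.einsum('di,dio->o', window, W) + b)
    return out

# Toy example: 8 frames of 3 features, maximum delay D = 2, 4 hidden units.
rng = np.random.default_rng(0)
x = rng.standard_normal((8, 3))
W = rng.standard_normal((3, 3, 4)) * 0.1
b = np.zeros(4)
y = tdnn_layer(x, W, b)
print(y.shape)  # (6, 4): one output per window position
```

Because each output frame depends only on a fixed-size window, the same weights handle sequences of any length, which is the flexibility the abstract attributes to the partial-expansion scheme.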
Pages: 33 - 42
Page count: 10
Related Papers
50 records in total
  • [1] Efficient keyword spotting using time delay neural networks
    Myer, Samuel
    Tomar, Vikrant Singh
    [J]. 19TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2018), VOLS 1-6: SPEECH RESEARCH FOR EMERGING MARKETS IN MULTILINGUAL SOCIETIES, 2018, : 1264 - 1268
  • [2] Neural networks handling sequential patterns
    Yamasaki, T
    Kataoka, Y
    Kameyama, K
    Nakano, K
    [J]. INFORMATION SCIENCES, 2004, 159 (3-4) : 141 - 154
  • [3] Discovering Sequential Patterns by Neural Networks
    Nowak, Jakub
    Korytkowski, Marcin
    Scherer, Rafal
    [J]. 2020 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2020,
  • [4] Firing patterns transition and desynchronization induced by time delay in neural networks
    Huang, Shoufang
    Zhang, Jiqian
    Wang, Maosheng
    Hu, Chin-Kun
    [J]. PHYSICA A-STATISTICAL MECHANICS AND ITS APPLICATIONS, 2018, 499 : 88 - 97
  • [5] Continuous time delay neural networks for detection of temporal patterns in signals
    Derakhshani, R
    Schuckers, SAC
    [J]. 2004 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-4, PROCEEDINGS, 2004, : 2723 - 2728
  • [6] Sequential Training of Neural Networks With Gradient Boosting
    Emami, Seyedsaman
    Martinez-Munoz, Gonzalo
    [J]. IEEE ACCESS, 2023, 11 : 42738 - 42750
  • [7] Recognition of sequential patterns by nonmonotonic neural networks
    Morita, Masahiko
    Murakami, Satoshi
    [J]. SYSTEMS AND COMPUTERS IN JAPAN, 1999, 30 (04): 11 - 19
  • [8] Fast time delay neural networks
    El-Bakry, HM
    Zhao, QF
    [J]. INTERNATIONAL JOURNAL OF NEURAL SYSTEMS, 2005, 15 (06) : 445 - 455
  • [9] Efficient training of backpropagation neural networks
    Otair, Mohammed A.
    Salameh, Walid A.
    [J]. NEURAL NETWORK WORLD, 2006, 16 (04) : 291 - 311
  • [10] Fast and Efficient Training of Neural Networks
    Yu, Hao
    Wilamowski
    [J]. 3RD INTERNATIONAL CONFERENCE ON HUMAN SYSTEM INTERACTION, 2010, : 175 - 181