Effective Batching for Recurrent Neural Network Grammars

Citations
0
Authors
Noji, Hiroshi [1 ]
Oseki, Yohei [2 ]
Affiliations
[1] Artificial Intelligence Research Center, AIST, Tokyo, Japan
[2] Graduate School of Arts and Sciences, University of Tokyo, Tokyo, Japan
DOI
none available
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
As a language model that integrates traditional symbolic operations with flexible neural representations, recurrent neural network grammars (RNNGs) have attracted great attention from both scientific and engineering perspectives. However, RNNGs are known to be hard to scale because their training is difficult to batch. In this paper, we propose effective batching for RNNGs, in which every operation is computed in parallel with tensors across multiple sentences. Our PyTorch implementation effectively exploits a GPU and achieves a 6x speedup over the existing C++ DyNet implementation with model-independent auto-batching. Our batched RNNG also accelerates inference, achieving a 20-150x speedup for beam search depending on beam size. Finally, we evaluate the syntactic generalization performance of the scaled RNNG against an LSTM baseline, training on 100M tokens from English Wikipedia and testing on a broad-coverage targeted syntactic evaluation benchmark.
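The core batching idea described in the abstract, computing every stack operation in parallel with tensors across sentences, can be illustrated with a short PyTorch sketch. This is a minimal illustration under assumed names (StackBatch, push, pop, and the tensor layout are all hypothetical), not the authors' implementation: each sentence's parser stack lives in one padded tensor, and parser actions are applied with boolean masks so the whole batch advances in a single tensor operation.

    import torch

    class StackBatch:
        """Hypothetical batched stack: one padded tensor holds every
        sentence's stack of hidden states, so parser actions become
        masked tensor operations instead of per-sentence loops."""

        def __init__(self, batch_size, max_depth, hidden_dim):
            self.h = torch.zeros(batch_size, max_depth, hidden_dim)
            self.top = torch.zeros(batch_size, dtype=torch.long)  # stack depths
            self.rows = torch.arange(batch_size)

        def push(self, new_h, mask):
            # mask: bool (batch,), True where this sentence pushes (e.g. SHIFT)
            self.h[self.rows[mask], self.top[mask]] = new_h[mask]
            self.top[mask] += 1

        def peek(self):
            # top hidden state of every stack, gathered in one operation
            return self.h[self.rows, (self.top - 1).clamp(min=0)]

        def pop(self, mask):
            # mask: True where this sentence pops (e.g. during a REDUCE)
            popped = self.peek()
            self.top[mask] -= 1
            return popped

    # Sentences taking different actions at the same time step are
    # selected by masks, so the GPU still processes the batch at once.
    stack = StackBatch(batch_size=4, max_depth=16, hidden_dim=8)
    stack.push(torch.randn(4, 8), mask=torch.tensor([True, True, False, True]))

A full RNNG would additionally need the stack-LSTM composition for REDUCE and per-hypothesis state tracking for beam search; the masking pattern above only sketches the basic mechanism that makes those operations batchable.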
Pages
4340-4352
Page count
13