Effective Batching for Recurrent Neural Network Grammars

Cited by: 0
Authors
Noji, Hiroshi [1 ]
Oseki, Yohei [2 ]
Affiliations
[1] AIST, Artificial Intelligence Research Center, Tokyo, Japan
[2] University of Tokyo, Graduate School of Arts and Sciences, Tokyo, Japan
DOI: Not available
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
As a language model that integrates traditional symbolic operations with flexible neural representations, recurrent neural network grammars (RNNGs) have attracted great attention from both scientific and engineering perspectives. However, RNNGs are known to be hard to scale because their training is difficult to batch. In this paper, we propose effective batching for RNNGs, in which every operation is computed in parallel with tensors across multiple sentences. Our PyTorch implementation effectively employs a GPU and achieves a 6x speedup over the existing C++ DyNet implementation with model-independent auto-batching. Moreover, our batched RNNG also accelerates inference, achieving a 20-150x speedup for beam search depending on beam size. Finally, we evaluate the syntactic generalization performance of the scaled RNNG against an LSTM baseline, training on 100M tokens from English Wikipedia and evaluating on a broad-coverage targeted syntactic evaluation benchmark.
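
To make the batching idea concrete, below is a minimal sketch, assuming a padded per-sentence stack tensor with masked PyTorch updates, of how stack pushes and pops can run as batched GPU operations across sentences; the function and variable names (batched_push, batched_pop, ptr) are illustrative assumptions, not taken from the paper's released code.

import torch

# A padded stack per sentence: stack[b, i] is the i-th element of sentence
# b's stack, and ptr[b] is that sentence's current stack depth.
batch_size, max_depth, hidden = 4, 16, 8
stack = torch.zeros(batch_size, max_depth, hidden)
ptr = torch.zeros(batch_size, dtype=torch.long)

def batched_push(stack, ptr, items, mask):
    # Push items[b] onto sentence b's stack wherever mask[b] is True,
    # leaving the other sentences untouched.
    rows = torch.arange(stack.size(0), device=stack.device)[mask]
    stack[rows, ptr[mask]] = items[mask]
    return stack, ptr + mask.long()

def batched_pop(stack, ptr, mask):
    # Pop the top element wherever mask[b] is True; popped rows for
    # unmasked sentences are zeroed out.
    ptr = ptr - mask.long()
    rows = torch.arange(stack.size(0), device=stack.device)
    popped = stack[rows, ptr.clamp(min=0)] * mask.unsqueeze(1).float()
    return stack, ptr, popped

# Example step: sentences 0 and 2 take a SHIFT action; 1 and 3 do nothing.
items = torch.randn(batch_size, hidden)
shift_mask = torch.tensor([True, False, True, False])
stack, ptr = batched_push(stack, ptr, items, shift_mask)

Because each sentence in the batch may take a different parser action at the same time step, the boolean mask is what lets SHIFT-style pushes and REDUCE-style pops for different sentences execute as single batched tensor operations rather than a per-sentence Python loop.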
Pages: 4340-4352
Number of pages: 13
Related Papers (50 records in total)
  • [1] Unsupervised Recurrent Neural Network Grammars
    Kim, Yoon
    Rush, Alexander M.
    Yu, Lei
    Kuncoro, Adhiguna
    Dyer, Chris
    Melis, Gabor
    2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 1105 - 1117
  • [2] Localizing syntactic predictions using recurrent neural network grammars
    Brennan, Jonathan R.
    Dyer, Chris
    Kuncoro, Adhiguna
    Hale, John T.
    NEUROPSYCHOLOGIA, 2020, 146
  • [3] Semantic Graph Parsing with Recurrent Neural Network DAG Grammars
    Fancellu, Federico
    Gilroy, Sorcha
    Lopez, Adam
    Lapata, Mirella
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 2769 - 2778
  • [4] What Do Recurrent Neural Network Grammars Learn About Syntax?
    Kuncoro, Adhiguna
    Ballesteros, Miguel
    Kong, Lingpeng
    Dyer, Chris
    Neubig, Graham
    Smith, Noah A.
    15TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EACL 2017), VOL 1: LONG PAPERS, 2017, : 1249 - 1258
  • [5] Localizing Syntactic Composition with Left-Corner Recurrent Neural Network Grammars
    Sugimoto, Yushi
    Yoshida, Ryo
    Jeong, Hyeonjeong
    Koizumi, Masatoshi
    Brennan, Jonathan R.
    Oseki, Yohei
NEUROBIOLOGY OF LANGUAGE, 2024, 5 (01): 201 - 224
  • [6] Modeling Human Sentence Processing with Left-Corner Recurrent Neural Network Grammars
    Yoshida, Ryo
    Noji, Hiroshi
    Oseki, Yohei
    2021 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP 2021), 2021, : 2964 - 2973
  • [7] A neural network approach for batching decisions in wafer fabrication
    Sung, CS
    Choung, YI
    INTERNATIONAL JOURNAL OF PRODUCTION RESEARCH, 1999, 37 (13) : 3101 - 3114
  • [8] Effective multinational trade forecasting using LSTM recurrent neural network
    Shen, Mei-Li
    Lee, Cheng-Feng
    Liu, Hsiou-Hsiang
    Chang, Po-Yin
    Yang, Cheng-Hong
    EXPERT SYSTEMS WITH APPLICATIONS, 2021, 182
  • [9] Can recurrent neural networks learn natural language grammars?
    Lawrence, S
    Giles, CL
    Fong, S
    ICNN - 1996 IEEE INTERNATIONAL CONFERENCE ON NEURAL NETWORKS, VOLS. 1-4, 1996, : 1853 - 1858