SPOKEN LANGUAGE UNDERSTANDING USING LONG SHORT-TERM MEMORY NEURAL NETWORKS

Cited by: 0
Authors
Yao, Kaisheng [1 ]
Peng, Baolin [1 ]
Zhang, Yu [1 ]
Yu, Dong [1 ]
Zweig, Geoffrey [1 ]
Shi, Yangyang [1 ]
Affiliations
[1] Microsoft, Redmond, WA 98052 USA
Keywords
Recurrent neural networks; long short-term memory; language understanding; ALGORITHM;
DOI
Not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Neural network based approaches have recently produced record-setting performances in natural language understanding tasks such as word labeling. In the word labeling task, a tagger assigns a label to each word in an input sequence. Specifically, simple recurrent neural networks (RNNs) and convolutional neural networks (CNNs) have been shown to significantly outperform the previous state of the art, conditional random fields (CRFs). This paper investigates using long short-term memory (LSTM) neural networks, which contain input, output, and forget gates and are more advanced than simple RNNs, for the word labeling task. To explicitly model output-label dependence, we propose a regression model on top of the un-normalized LSTM scores. We also propose applying deep LSTMs to the task. We investigate the relative importance of each gate in the LSTM by fixing the other gates to a constant and learning only particular gates. Experiments on the ATIS dataset validate the effectiveness of the proposed models.
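The tagging setup the abstract describes can be sketched as one LSTM step per word, with a linear output layer producing an un-normalized score for each label. The following is a minimal NumPy illustration under assumed dimensions, not the paper's implementation; names such as `lstm_step` and `W_out` are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM step with input (i), forget (f), and output (o) gates.
    W: (4*H, D+H) stacked gate weights; b: (4*H,) bias."""
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0:H])          # input gate
    f = sigmoid(z[H:2 * H])      # forget gate
    o = sigmoid(z[2 * H:3 * H])  # output gate
    g = np.tanh(z[3 * H:4 * H])  # candidate cell update
    c = f * c_prev + i * g       # new cell state
    h = o * np.tanh(c)           # new hidden state
    return h, c

# Tag each word of a sequence: hidden state -> un-normalized label scores.
rng = np.random.default_rng(0)
D, H, L = 8, 16, 5               # input dim, hidden dim, number of labels (assumed)
W = rng.normal(scale=0.1, size=(4 * H, D + H))
b = np.zeros(4 * H)
W_out = rng.normal(scale=0.1, size=(L, H))

h, c = np.zeros(H), np.zeros(H)
scores = []
for x in rng.normal(size=(6, D)):    # a 6-word input sequence (random stand-in)
    h, c = lstm_step(x, h, c, W, b)
    scores.append(W_out @ h)         # un-normalized score per label
scores = np.array(scores)
print(scores.shape)  # (6, 5): one score vector per word
```

The gate-ablation experiments mentioned in the abstract correspond to freezing `i`, `f`, or `o` at a constant value instead of computing it from `z`.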
Pages: 189-194 (6 pages)
Related Papers
50 records
  • [1] Short-Term Traffic Prediction Using Long Short-Term Memory Neural Networks
    Abbas, Zainab
    Al-Shishtawy, Ahmad
    Girdzijauskas, Sarunas
    Vlassov, Vladimir
    [J]. 2018 IEEE INTERNATIONAL CONGRESS ON BIG DATA (IEEE BIGDATA CONGRESS), 2018, : 57 - 65
  • [2] Recognition of Sign Language System for Indonesian Language Using Long Short-Term Memory Neural Networks
    Rakun, Erdefi
    Arymurthy, Aniati M.
    Stefanus, Lim Y.
    Wicaksono, Alfan F.
    Wisesa, I. Wayan W.
    [J]. ADVANCED SCIENCE LETTERS, 2018, 24 (02) : 999 - 1004
  • [3] Language Identification in Short Utterances Using Long Short-Term Memory (LSTM) Recurrent Neural Networks
    Zazo, Ruben
    Lozano-Diez, Alicia
    Gonzalez-Dominguez, Javier
    Toledano, Doroteo T.
    Gonzalez-Rodriguez, Joaquin
    [J]. PLOS ONE, 2016, 11 (01):
  • [4] Dialog State Tracking Using Long Short-term Memory Neural Networks
    Yang, Xiaohao
    Liu, Jia
    [J]. 16TH ANNUAL CONFERENCE OF THE INTERNATIONAL SPEECH COMMUNICATION ASSOCIATION (INTERSPEECH 2015), VOLS 1-5, 2015, : 1800 - 1804
  • [5] An Incremental Learning Approach Using Long Short-Term Memory Neural Networks
    Lemos Neto, Alvaro C.
    Coelho, Rodrigo A.
    de Castro, Cristiano L.
    [J]. JOURNAL OF CONTROL AUTOMATION AND ELECTRICAL SYSTEMS, 2022, 33 (05) : 1457 - 1465
  • [6] Deflated reputation using multiplicative long short-term memory neural networks
    Ma, Yixuan
    Zhang, Zhenji
    Li, Deming
    Tang, Mincong
    [J]. FUTURE GENERATION COMPUTER SYSTEMS-THE INTERNATIONAL JOURNAL OF ESCIENCE, 2021, 118 : 198 - 207
  • [7] Predicting Marimba Stickings Using Long Short-Term Memory Neural Networks
    Chong, Jet Kye
    Correa, Debora
    [J]. AI 2022: ADVANCES IN ARTIFICIAL INTELLIGENCE, 2022, 13728 : 339 - 352
  • [8] Accelerating Inference In Long Short-Term Memory Neural Networks
    Mealey, Thomas
    Taha, Tarek M.
    [J]. NAECON 2018 - IEEE NATIONAL AEROSPACE AND ELECTRONICS CONFERENCE, 2018, : 382 - 390
  • [9] Recurrent Neural Networks with External Memory for Spoken Language Understanding
    Peng, Baolin
    Yao, Kaisheng
    Jing, Li
    Wong, Kam-Fai
    [J]. NATURAL LANGUAGE PROCESSING AND CHINESE COMPUTING, NLPCC 2015, 2015, 9362 : 25 - 35