A Self-Attentive Model with Gate Mechanism for Spoken Language Understanding

Cited by: 0
Authors
Li, Changliang [1 ]
Li, Liang [2 ]
Qi, Ji [1 ]
Affiliations
[1] Kingsoft AI Lab, Beijing, Peoples R China
[2] Tsinghua Univ, Beijing, Peoples R China
Keywords
RECURRENT NEURAL-NETWORKS;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Spoken Language Understanding (SLU), which typically involves intent determination and slot filling, is a core component of spoken dialogue systems. Joint learning has been shown to be effective for SLU, since slot tags and intents are expected to share knowledge with each other. However, most existing joint learning methods share parameters only at the surface level rather than at the semantic level. In this work, we propose a novel self-attentive model with a gate mechanism that fully exploits the semantic correlation between slots and intents. Our model first obtains intent-augmented embeddings from a neural network with a self-attention mechanism; the intent semantic representation is then used as a gate for labelling slot tags. The objectives of both tasks are optimized simultaneously via joint learning in an end-to-end fashion. We conduct experiments on the popular ATIS benchmark. The results show that our model achieves state-of-the-art performance and outperforms other popular methods by a large margin in terms of both intent detection error rate and slot filling F1-score. This paper offers a new perspective for research on SLU.
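Below is a minimal PyTorch sketch of the architecture the abstract describes: self-attention produces intent-augmented token representations, an utterance-level intent representation gates the token states before slot tagging, and both objectives are trained jointly. The layer sizes, the mean-pooled intent summary, and the sigmoid gating form are our assumptions for illustration, not the authors' reference implementation.

```python
import torch
import torch.nn as nn


class SelfAttentiveGateSLU(nn.Module):
    """Illustrative joint intent-detection / slot-filling model."""

    def __init__(self, vocab_size, n_intents, n_slots,
                 emb_dim=128, hid_dim=128, n_heads=4):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)
        # Self-attention over the utterance yields intent-augmented
        # token representations (as described in the abstract).
        self.self_attn = nn.MultiheadAttention(emb_dim, n_heads,
                                               batch_first=True)
        self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True,
                               bidirectional=True)
        self.intent_out = nn.Linear(2 * hid_dim, n_intents)
        # Gate: the intent representation modulates each token state
        # before slot tagging (sigmoid gating is one plausible choice).
        self.gate = nn.Linear(2 * hid_dim, 2 * hid_dim)
        self.slot_out = nn.Linear(2 * hid_dim, n_slots)

    def forward(self, tokens):
        x = self.emb(tokens)                           # (B, T, E)
        attn, _ = self.self_attn(x, x, x)              # intent-augmented
        h, _ = self.encoder(x + attn)                  # (B, T, 2H)
        intent_repr = h.mean(dim=1)                    # utterance summary
        intent_logits = self.intent_out(intent_repr)   # (B, n_intents)
        g = torch.sigmoid(self.gate(intent_repr))      # (B, 2H) gate
        slot_logits = self.slot_out(h * g.unsqueeze(1))  # (B, T, n_slots)
        return intent_logits, slot_logits
```

Training would then minimize the sum of two cross-entropy losses, one over intents and one over per-token slot tags, matching the end-to-end joint objective described above.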
Pages: 3824-3833
Page count: 10
Related papers
50 records in total
  • [1] Energy-based Self-attentive Learning of Abstractive Communities for Spoken Language Understanding
    Shang, Guokan
    Tixier, Antoine J-P
    Vazirgiannis, Michalis
    Lorre, Jean-Pierre
    1ST CONFERENCE OF THE ASIA-PACIFIC CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS AND THE 10TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (AACL-IJCNLP 2020), 2020, : 313 - 327
  • [2] A joint model based on interactive gate mechanism for spoken language understanding
    Sun, Chengai
    Lv, Liangyu
    Liu, Tailu
    Li, Tangjun
    APPLIED INTELLIGENCE, 2022, 52 (06) : 6057 - 6064
  • [3] Sequential Self-Attentive Model for Knowledge Tracing
    Zhang, Xuelong
    Zhang, Juntao
    Lin, Nanzhou
    Yang, Xiandi
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891 : 318 - 330
  • [4] SACNN: Self-attentive Convolutional Neural Network Model for Natural Language Inference
    Quamer, Waris
    Jain, Praphula Kumar
    Rai, Arpit
    Saravanan, Vijayalakshmi
    Pamula, Rajendra
    Kumar, Chiranjeev
    ACM TRANSACTIONS ON ASIAN AND LOW-RESOURCE LANGUAGE INFORMATION PROCESSING, 2021, 20 (03)
  • [5] Self-Attentive Associative Memory
    Le, Hung
    Tran, Truyen
    Venkatesh, Svetha
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [6] A self-attentive model for tracing knowledge and engagement in parallel
    Jiang, Hua
    Xiao, Bing
    Luo, Yintao
    Ma, Junliang
    PATTERN RECOGNITION LETTERS, 2023, 165 : 25 - 32
  • [7] Self-Attentive Sequential Recommendation
    Kang, Wang-Cheng
    McAuley, Julian
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018, : 197 - 206
  • [8] Improving Disfluency Detection by Self-Training a Self-Attentive Model
    Lou, Paria Jamshid
    Johnson, Mark
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 3754 - 3763
  • [9] On the Robustness of Self-Attentive Models
    Hsieh, Yu-Lun
    Cheng, Minhao
    Juan, Da-Cheng
    Wei, Wei
    Hsu, Wen-Lian
    Hsieh, Cho-Jui
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 1520 - 1529