Encoding Syntactic Knowledge in Transformer Encoder for Intent Detection and Slot Filling

Cited by: 0
Authors
Wang, Jixuan [1 ,2 ,3 ]
Wei, Kai [3 ]
Radfar, Martin [3 ]
Zhang, Weiwei [3 ]
Chung, Clement [3 ]
Affiliations
[1] Univ Toronto, Toronto, ON, Canada
[2] Vector Inst, Toronto, ON, Canada
[3] Amazon Alexa, Pittsburgh, PA 15205 USA
Keywords
NEURAL-NETWORKS;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
We propose a novel Transformer encoder-based architecture that encodes syntactic knowledge for intent detection and slot filling. Specifically, we inject syntactic knowledge into the Transformer encoder by jointly training it, via multi-task learning, to predict the syntactic parse ancestors and part-of-speech tag of each token. Our model is built from self-attention and feed-forward layers and does not require external syntactic information at inference time. Experiments on two benchmark datasets show that our models achieve state-of-the-art results with only two Transformer encoder layers. Compared to the previous best-performing model without pre-training, our models achieve absolute improvements of 1.59% F1 for slot filling and 0.85% accuracy for intent detection on the SNIPS dataset. On the ATIS dataset, our models improve over the previous best-performing model by 0.1% F1 for slot filling and 0.34% accuracy for intent detection. Furthermore, visualization of the self-attention weights illustrates the benefit of incorporating syntactic information during training.
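The multi-task setup described in the abstract can be sketched as a shared Transformer encoder with four prediction heads: intent (utterance level), slots (token level), and two auxiliary syntactic heads for POS tags and parse-ancestor indices used only during training. The sketch below is a minimal PyTorch illustration; all dimensions, head designs, and the use of the first token as an utterance representation are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class SyntaxAwareEncoder(nn.Module):
    """Hypothetical sketch of a 2-layer Transformer encoder with
    multi-task heads (intent, slots, POS, parse ancestor).
    Sizes are illustrative, not from the paper."""

    def __init__(self, vocab_size, d_model=128, n_heads=4, n_layers=2,
                 n_intents=7, n_slots=72, n_pos=17, max_len=50):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.pos_embed = nn.Embedding(max_len, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # Task-specific heads share the same encoder output.
        self.intent_head = nn.Linear(d_model, n_intents)  # utterance level
        self.slot_head = nn.Linear(d_model, n_slots)      # token level
        self.pos_head = nn.Linear(d_model, n_pos)         # auxiliary: POS tag
        self.anc_head = nn.Linear(d_model, max_len)       # auxiliary: ancestor position

    def forward(self, tokens):
        positions = torch.arange(tokens.size(1), device=tokens.device)
        h = self.encoder(self.embed(tokens) + self.pos_embed(positions))
        return {
            "intent": self.intent_head(h[:, 0]),  # assume a [CLS]-like first token
            "slots": self.slot_head(h),
            "pos": self.pos_head(h),
            "ancestor": self.anc_head(h),
        }

model = SyntaxAwareEncoder(vocab_size=1000)
out = model(torch.randint(0, 1000, (2, 10)))  # batch of 2, length 10
```

During training, the four cross-entropy losses would be combined (e.g. as a weighted sum); at inference time only the intent and slot heads are needed, so no external parser or tagger is required, consistent with the abstract's claim.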
Pages: 13943-13951
Page count: 9
Related Papers
50 records in total
  • [1] A novel model based on a transformer for intent detection and slot filling
    Dapeng Li
    Shuliang Wang
    Boxiang Zhao
    Zhiqiang Ma
    Leixiao Li
    Urban Informatics, 3 (1):
  • [2] A CO-INTERACTIVE TRANSFORMER FOR JOINT SLOT FILLING AND INTENT DETECTION
    Qin, Libo
    Liu, Tailu
    Che, Wanxiang
    Kang, Bingbing
    Zhao, Sendong
    Liu, Ting
    2021 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP 2021), 2021, : 8193 - 8197
  • [3] Multitask Learning with Knowledge Base for Joint Intent Detection and Slot Filling
    He, Ting
    Xu, Xiaohong
    Wu, Yating
    Wang, Huazhen
    Chen, Jian
    APPLIED SCIENCES-BASEL, 2021, 11 (11):
  • [4] Intent Detection and Slot Filling for Vietnamese
    Mai Hoang Dao
    Thinh Hung Truong
    Dat Quoc Nguyen
    INTERSPEECH 2021, 2021, : 4698 - 4702
  • [5] Joint model of intent detection and slot filling of knowledge question for crop diseases and pests using CNN-Transformer
    Wang, Lu
    Liu, Ruilin
    Huang, Jingzhong
    Guo, Xuchao
    Nongye Gongcheng Xuebao/Transactions of the Chinese Society of Agricultural Engineering, 2024, 40 (13): : 156 - 162
  • [6] From Disfluency Detection to Intent Detection and Slot Filling
    Mai Hoang Dao
    Thinh Hung Truong
    Dat Quoc Nguyen
    INTERSPEECH 2022, 2022, : 1106 - 1110
  • [7] Joint intent detection and slot filling with syntactic and semantic features using multichannel CNN-BiLSTM
    Muhammad, Yusuf Idris
    Salim, Naomie
    Zainal, Anazida
    PeerJ Computer Science, 2024, 10
  • [8] Joint Intent Detection and Slot Filling of Knowledge Question Answering for Agricultural Diseases and Pests
    Guo X.
    Hao X.
    Yao X.
    Li L.
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2023, 54 (01): : 205 - 215
  • [9] Explainable Abuse Detection as Intent Classification and Slot Filling
    Calabrese, Agostina
    Ross, Bjorn
    Lapata, Mirella
    TRANSACTIONS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, 2022, 10 : 1440 - 1454
  • [10] The Impact of Romanian Diacritics on Intent Detection and Slot Filling
    Stoica, Anda Diana
    Rad, Andrei-Cristian
    Muntean, Ioan Horia
    Daian, George
    Lemnaru, Camelia
    Potolea, Rodica
    Dinsoreanu, Mihaela
    PROCEEDINGS OF 2020 IEEE INTERNATIONAL CONFERENCE ON AUTOMATION, QUALITY AND TESTING, ROBOTICS (AQTR), 2020, : 467 - 472