MISCA: A Joint Model for Multiple Intent Detection and Slot Filling with Intent-Slot Co-Attention

Cited by: 0
Authors
Pham, Thinh [1]
Tran, Chi [1]
Nguyen, Dat Quoc [1]
Institutions
[1] VinAI Res, Ho Chi Minh City, Vietnam
Keywords
DOI
(not available)
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Research on detecting multiple intents and filling slots is gaining popularity because of its relevance to complex real-world scenarios. Recent advanced approaches, which are joint models based on graphs, might still face two potential issues: (i) the uncertainty introduced by constructing graphs based on preliminary intents and slots, which may transfer intent-slot correlation information to incorrect label node destinations, and (ii) direct incorporation of multiple intent labels for each token w.r.t. token-level intent voting, which might lead to incorrect slot predictions, thereby hurting the overall performance. To address these two issues, we propose a joint model named MISCA. Our MISCA introduces an intent-slot co-attention mechanism and an underlying layer of label attention mechanism. These mechanisms enable MISCA to effectively capture correlations between intents and slot labels, eliminating the need for graph construction. They also facilitate the transfer of correlation information in both directions, from intents to slots and from slots to intents, through multiple levels of label-specific representations, without relying on token-level intent information. Experimental results show that MISCA outperforms previous models, achieving new state-of-the-art overall accuracy on the two benchmark datasets MixATIS and MixSNIPS. This highlights the effectiveness of our attention mechanisms.
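The abstract's central idea, bidirectional attention between intent-label and slot-label representations, can be illustrated with a minimal sketch. This is not the paper's exact architecture (which layers this beneath a label attention mechanism over token representations); the function name, residual update, and random toy inputs are illustrative assumptions.

```python
import torch

def co_attention(intent_reprs, slot_reprs):
    """Bidirectional attention between label-specific representations.

    intent_reprs: (num_intents, d) -- one vector per intent label
    slot_reprs:   (num_slots, d)   -- one vector per slot label
    Information flows in both directions: intents -> slots and
    slots -> intents, with no graph construction step.
    """
    # Affinity score between every (intent label, slot label) pair.
    scores = intent_reprs @ slot_reprs.T            # (num_intents, num_slots)

    # Each intent label attends over all slot labels ...
    intent_ctx = torch.softmax(scores, dim=1) @ slot_reprs
    # ... and each slot label attends over all intent labels.
    slot_ctx = torch.softmax(scores.T, dim=1) @ intent_reprs

    # Residual update preserves the original label-specific signal.
    return intent_reprs + intent_ctx, slot_reprs + slot_ctx

# Toy usage with random label representations (4 intents, 10 slots, dim 8).
torch.manual_seed(0)
intents, slots = co_attention(torch.randn(4, 8), torch.randn(10, 8))
print(intents.shape, slots.shape)
```

In MISCA this exchange is applied through multiple levels of label-specific representations, so correlation information accumulates without any token-level intent voting.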
Pages: 12641-12650
Page count: 10
Related Papers
(50 items)
  • [21] Automatic Intent-Slot Induction for Dialogue Systems
    Zeng, Zengfeng
    Ma, Dan
    Yang, Haiqin
    Gou, Zhen
    Shen, Jianping
    PROCEEDINGS OF THE WORLD WIDE WEB CONFERENCE 2021 (WWW 2021), 2021, : 2578 - 2589
  • [22] A novel model based on a transformer for intent detection and slot filling
    Li, Dapeng
    Wang, Shuliang
    Zhao, Boxiang
    Ma, Zhiqiang
    Li, Leixiao
    Urban Informatics, 3 (1):
  • [23] Multitask Learning with Knowledge Base for Joint Intent Detection and Slot Filling
    He, Ting
    Xu, Xiaohong
    Wu, Yating
    Wang, Huazhen
    Chen, Jian
    APPLIED SCIENCES-BASEL, 2021, 11 (11):
  • [24] Joint intent detection and slot filling for Turkish natural language understanding
    Buyuk, Osman
    TURKISH JOURNAL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCES, 2023, 31 (05) : 844 - 859
  • [25] Task Conditioned BERT for Joint Intent Detection and Slot-Filling
    Tavares, Diogo
    Azevedo, Pedro
    Semedo, David
    Sousa, Ricardo
    Magalhaes, Joao
    PROGRESS IN ARTIFICIAL INTELLIGENCE, EPIA 2023, PT I, 2023, 14115 : 467 - 480
  • [26] Joint Slot Filling and Intent Detection via Capsule Neural Networks
    Zhang, Chenwei
    Li, Yaliang
    Du, Nan
    Fan, Wei
    Yu, Philip S.
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5259 - 5267
  • [27] SlotRefine: A Fast Non-Autoregressive Model for Joint Intent Detection and Slot Filling
    Wu, Di
    Ding, Liang
    Lu, Fan
    Xie, Jian
    PROCEEDINGS OF THE 2020 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING (EMNLP), 2020, : 1932 - 1937
  • [28] A Novel Bi-directional Interrelated Model for Joint Intent Detection and Slot Filling
    E, Haihong
    Niu, Peiqing
    Chen, Zhongfu
    Song, Meina
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 5467 - 5471
  • [29] Joint Training Model of Intent Detection and Slot Filling for Multi Granularity Implicit Guidance
    Li, Bin
    Wang, Weihua
    Bao, Feilong
    2022 INTERNATIONAL CONFERENCE ON ASIAN LANGUAGE PROCESSING (IALP 2022), 2022, : 271 - 274
  • [30] From Disfluency Detection to Intent Detection and Slot Filling
    Mai Hoang Dao
    Thinh Hung Truong
    Dat Quoc Nguyen
    INTERSPEECH 2022, 2022, : 1106 - 1110