PETIS: Intent Classification and Slot Filling for Pet Care Services

Cited: 0
|
Authors
Zaman, Namrah [1]
Park, Seong-Jin [2]
Won, Hyun-Sik [1]
Kim, Min-Ji [1]
An, Hee-Su [1]
Kim, Kang-Min [1,3]
Affiliations
[1] Catholic Univ Korea, Dept Artificial Intelligence, Bucheon Si 14662, South Korea
[2] Catholic Univ Korea, Dept Math, Bucheon Si 14662, South Korea
[3] Catholic Univ Korea, Dept Data Sci, Bucheon 14662, South Korea
Source
IEEE ACCESS | 2024 / Volume 12
Funding
National Research Foundation of Singapore;
Keywords
Intent recognition; Medical services; Multitasking; Prevention and mitigation; Artificial intelligence; Natural language processing; Animals; Online services; Conversational AI; intent classification; Korean language understanding; natural language understanding; parameter-efficient fine-tuning; pet care services; slot filling;
DOI
10.1109/ACCESS.2024.3452771
Chinese Library Classification (CLC): TP [Automation technology, computer technology];
Discipline code: 0812;
Abstract
During the COVID-19 pandemic, the surge in online pet care services led to increased demand for conversational AI systems designed specifically for the veterinary domain. However, traditional natural language understanding (NLU) tasks and datasets often fall short due to domain-specific terminology, the descriptive nature of user utterances, and the high cost of expert annotations. To fill this gap, we introduce PETIS, a novel dataset of 10,636 annotated utterances designed for intent classification and slot filling in the pet care domain, featuring 10 unique intent classes and 11 slot classes. PETIS addresses the scarcity of annotated data in this domain and serves as a challenging benchmark for evaluating NLU models. We demonstrate its effectiveness through experiments with state-of-the-art models, achieving an accuracy of 93.32 in intent classification and a Micro F1 score of 91.21 in slot filling using multitask AdapterFusion. Furthermore, domain adaptation significantly enhanced performance. These results showcase the potential of PETIS to drive research and development in conversational AI for online pet care services and offer a valuable resource for advancing the field.
Pages: 124314 - 124329
Page count: 16
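For readers implementing the benchmark, the span-level micro-averaged F1 reported above for slot filling can be sketched as below. The BIO tagging scheme and the helper names (`extract_spans`, `micro_f1`) are illustrative assumptions, not part of the PETIS release.

```python
def extract_spans(tags):
    """Collect (start, end, slot_type) spans from one BIO tag sequence."""
    spans, start, label = set(), None, None
    for i, tag in enumerate(tags + ["O"]):  # trailing "O" flushes the last span
        # Close the open span on O, on a new B-, or on an I- of another type.
        if tag == "O" or tag.startswith("B-") or (
            tag.startswith("I-") and label != tag[2:]
        ):
            if start is not None:
                spans.add((start, i, label))
                start, label = None, None
        if tag.startswith("B-"):
            start, label = i, tag[2:]
        elif tag.startswith("I-") and start is None:
            start, label = i, tag[2:]  # tolerate I- without a preceding B-
    return spans

def micro_f1(gold_seqs, pred_seqs):
    """Micro F1 over exact span matches, pooled across all utterances."""
    tp = fp = fn = 0
    for gold, pred in zip(gold_seqs, pred_seqs):
        g, p = extract_spans(gold), extract_spans(pred)
        tp += len(g & p)
        fp += len(p - g)
        fn += len(g - p)
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    return 2 * prec * rec / (prec + rec) if prec + rec else 0.0
```

Micro averaging pools true/false positives over the whole corpus before computing precision and recall, so frequent slot classes weigh more than rare ones; this matches the common convention for slot-filling evaluation.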
Related Papers
50 records in total
  • [21] MISCA: A Joint Model for Multiple Intent Detection and Slot Filling with Intent-Slot Co-Attention
    Pham, Thinh
    Tran, Chi
    Nguyen, Dat Quoc
    FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (EMNLP 2023), 2023, : 12641 - 12650
  • [22] Leveraging Pretrained ASR Encoders for Efficient End-to-End Speech Intent Classification and Slot Filling
    Huang, He
    Balam, Jagadeesh
    Ginsburg, Boris
    INTERSPEECH 2023, 2023, : 2933 - 2937
  • [23] A Transformer based Multi-task Model for Domain Classification, Intent Detection and Slot-Filling
    Saha, Tulika
    Priya, Neeti
    Saha, Sriparna
    Bhattacharyya, Pushpak
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [24] Semantically Guided Enhanced Fusion for Intent Detection and Slot Filling
    Cai, Songtao
    Ma, Qicheng
    Hou, Yupeng
    Zeng, Guangping
    APPLIED SCIENCES-BASEL, 2023, 13 (22):
  • [25] A Neural Framework for Joint Prediction on Intent Identification and Slot Filling
    Shan, Jiawei
    Xu, Huayun
    Gong, Zeyang
    Su, Hanchen
    Han, Xu
    Li, Binyang
    COGNITIVE COMPUTING - ICCC 2019, 2019, 11518 : 12 - 25
  • [26] WAIS: Word Attention for Joint Intent Detection and Slot Filling
    Chen, Sixuan
    Yu, Shuai
    THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2019, : 9927 - 9928
  • [27] A novel model based on a transformer for intent detection and slot filling
    Li, Dapeng
    Wang, Shuliang
    Zhao, Boxiang
    Ma, Zhiqiang
    Li, Leixiao
    Urban Informatics, 3 (1):
  • [28] Dirichlet variational autoencoder for joint slot filling and intent detection
    Gao, Wang
    Wang, Yu-Wei
    Zhang, Fan
    Fang, Yuan
    Journal of Computers (Taiwan), 2021, 32 (02) : 61 - 73
  • [29] Transformers for Multi-Intent Classification and Slot Filling of Supreme Court Decisions Related to Sexual Violence Law
    Munthuli, Adirek
    Socatiyanurak, Vorada
    Sangchocanonta, Sirikorn
    Kovudhikulrungsri, Lalin
    Saksakulkunakorn, Nantawat
    Chairuangsri, Phornkanok
    Tantibundhit, Charturong
    IEEE ACCESS, 2023, 11 : 76448 - 76467
  • [30] ESIE-BERT: Enriching sub-words information explicitly with BERT for intent classification and slot filling
    Guo, Yu
    Xie, Zhilong
    Chen, Xingyan
    Chen, Huangen
    Wang, Leilei
    Du, Huaming
    Wei, Shaopeng
    Zhao, Yu
    Li, Qing
    Wu, Gang
    NEUROCOMPUTING, 2024, 591