Unified Knowledge Prompt Pre-training for Customer Service Dialogues

Cited by: 1
Authors:
He, Keqing [1 ]
Wang, Jingang [1 ]
Sun, Chaobo [1 ]
Wu, Wei [1 ]
Affiliations:
[1] Meituan Grp, Beijing, Peoples R China
Keywords:
dialogue pre-training; knowledge; prompt;
DOI:
10.1145/3511808.3557718
Chinese Library Classification: TP [Automation technology, computer technology]
Subject classification code: 0812
Abstract:
Dialogue bots have been widely applied in customer service scenarios to provide a timely and user-friendly experience. These bots must classify the appropriate domain of a dialogue, understand the intent of users, and generate proper responses. Existing dialogue pre-training models are designed for only a few dialogue tasks and ignore the weakly-supervised expert knowledge present in customer service dialogues. In this paper, we propose a novel unified knowledge prompt pre-training framework, UFA (Unified Model For All Tasks), for customer service dialogues. We formulate all the tasks of customer service dialogues as a unified text-to-text generation task and introduce a knowledge-driven prompt strategy to jointly learn from a mixture of distinct dialogue tasks. We pre-train UFA on a large-scale Chinese customer service corpus collected from practical scenarios and achieve significant improvements on both natural language understanding (NLU) and natural language generation (NLG) benchmarks.
Pages: 4009-4013
Number of pages: 5
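
Below is a minimal, illustrative sketch of the idea described in the abstract: casting distinct customer-service dialogue tasks (domain classification, intent detection, response generation) as a single text-to-text generation problem with a knowledge-driven prompt prepended to the input. The backbone model, prompt template, task names, and knowledge snippet are assumptions made for illustration and are not taken from the paper; UFA pre-trains its own model on a proprietary Chinese customer service corpus.

# Sketch of a unified text-to-text formulation with knowledge prompts.
# Backbone and templates are placeholders, not the paper's actual setup.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "google/mt5-small"  # placeholder backbone, not the pre-trained UFA model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def build_prompt(task: str, dialogue: str, knowledge: str = "") -> str:
    """Prepend a task-specific prompt and optional weakly-supervised knowledge
    (e.g., an agent-written policy snippet) so every task shares one input format."""
    return f"task: {task} knowledge: {knowledge} dialogue: {dialogue}"

examples = [
    ("domain classification", "My package has not arrived yet.", ""),
    ("intent detection", "Can I change the delivery address?", ""),
    ("response generation", "The food arrived cold.", "Policy: offer a coupon for quality issues."),
]

for task, dialogue, knowledge in examples:
    inputs = tokenizer(build_prompt(task, dialogue, knowledge), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    # Each task produces its answer as generated text, so NLU and NLG tasks
    # can be mixed in one pre-training objective.
    print(task, "->", tokenizer.decode(outputs[0], skip_special_tokens=True))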