Unified Knowledge Prompt Pre-training for Customer Service Dialogues

Cited by: 1
Authors
He, Keqing [1 ]
Wang, Jingang [1 ]
Sun, Chaobo [1 ]
Wu, Wei [1 ]
Affiliations
[1] Meituan Grp, Beijing, Peoples R China
Keywords
dialogue pre-training; knowledge; prompt;
DOI
10.1145/3511808.3557718
CLC classification
TP [Automation Technology, Computer Technology]
Discipline code
0812
Abstract
Dialogue bots have been widely applied in customer service scenarios to provide a timely and user-friendly experience. These bots must classify the domain of a dialogue, understand user intent, and generate appropriate responses. Existing dialogue pre-training models are designed for only a few dialogue tasks and ignore the weakly supervised expert knowledge available in customer service dialogues. In this paper, we propose UFA (Unified Model For All Tasks), a novel unified knowledge prompt pre-training framework for customer service dialogues. We formulate all customer service dialogue tasks as a unified text-to-text generation task and introduce a knowledge-driven prompt strategy to jointly learn from a mixture of distinct dialogue tasks. We pre-train UFA on a large-scale Chinese customer service corpus collected from practical scenarios and obtain significant improvements on both natural language understanding (NLU) and natural language generation (NLG) benchmarks.
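The unified formulation described above can be sketched in a few lines: each task (domain classification, intent detection, response generation) is cast as a (source, target) text pair, with expert knowledge prepended as part of the prompt. This is a minimal illustrative sketch only; the task names, prompt wording, and knowledge field are assumptions, not the paper's actual templates.

```python
def to_text_to_text(task, dialogue, knowledge=None, label=None):
    """Build a (source, target) pair for a unified text-to-text model.

    Hypothetical prompt templates in the spirit of UFA's unified
    formulation; the real templates are defined in the paper.
    """
    prompt = {
        "domain": "classify domain:",
        "intent": "detect intent:",
        "response": "generate response:",
    }[task]
    # Weakly supervised expert knowledge is injected into the prompt.
    knowledge_part = f" knowledge: {knowledge}" if knowledge else ""
    source = f"{prompt}{knowledge_part} dialogue: {dialogue}"
    target = label if label is not None else ""
    return source, target

# Example: an intent-detection instance with an attached knowledge snippet.
src, tgt = to_text_to_text(
    "intent",
    "User: my order has not arrived yet",
    knowledge="delivery FAQ: late-order handling",
    label="order_delay_inquiry",
)
```

Because every task shares this single input/output format, one sequence-to-sequence model can be jointly pre-trained on a mixture of all of them.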
Pages: 4009-4013
Page count: 5