CONVERSATIONAL QUERY REWRITING WITH SELF-SUPERVISED LEARNING

Cited by: 3
Authors
Liu, Hang [1]
Chen, Meng [1]
Wu, Youzheng [1]
He, Xiaodong [1]
Zhou, Bowen [1]
Affiliations
[1] JD AI, Beijing, People's Republic of China
Funding
National Key R&D Program of China
Keywords
conversational query rewriting; self-supervised learning; multi-turn dialogue
DOI
10.1109/ICASSP39728.2021.9413557
Chinese Library Classification
O42 [Acoustics]
Subject Classification Codes
070206; 082403
Abstract
Context modeling plays a critical role in building multi-turn dialogue systems. Conversational Query Rewriting (CQR) aims to simplify multi-turn dialogue modeling into a single-turn problem by explicitly rewriting the conversational query into a self-contained utterance. However, existing approaches rely on massive supervised training data, which is labor-intensive to annotate. Moreover, detecting important information omitted from the context can be further improved, and the intent consistency constraint between the contextual query and the rewritten query is ignored. To tackle these issues, we first propose to construct a large-scale CQR dataset automatically via self-supervised learning, requiring no human annotation. We then introduce Teresa, a novel Transformer-based CQR model enhanced by self-attentive keyword detection and an intent consistency constraint. Finally, we conduct extensive experiments on two public datasets. The results demonstrate that the proposed model significantly outperforms existing CQR baselines, and also confirm the effectiveness of self-supervised learning in improving CQR performance.
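To make the task concrete: CQR takes a context-dependent query from a multi-turn dialogue and rewrites it into a self-contained utterance. The sketch below is a toy, rule-based stand-in for that input/output contract only (the paper's Teresa model is a Transformer); the function, pronoun list, and example dialogue are all hypothetical illustrations, not the authors' method.

```python
def rewrite_query(context: list, query: str, referent: str) -> str:
    """Toy CQR: replace the first anaphoric pronoun in `query` with an
    explicit `referent` recovered from the dialogue `context`, producing
    a self-contained utterance. Real CQR models learn this end-to-end."""
    pronouns = ("it", "him", "her", "them")
    tokens = query.split()
    for i, token in enumerate(tokens):
        if token.lower() in pronouns:
            tokens[i] = referent
            return " ".join(tokens)
    return query  # already self-contained; nothing to rewrite

# A context-dependent query and its self-contained rewrite:
context = ["Who directed Inception?", "Christopher Nolan directed it."]
print(rewrite_query(context, "When was it released?", "Inception"))
# -> "When was Inception released?"
```

A learned model must additionally decide *which* context span to copy and keep the rewritten query's intent consistent with the original, which is exactly what the paper's keyword detection and intent consistency components target.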
Pages: 7628-7632
Number of pages: 5