Neural response generation for task completion using conversational knowledge graph

Cited by: 1
Authors
Ahmad, Zishan [1 ]
Ekbal, Asif [1 ]
Sengupta, Shubhashis [2 ]
Bhattacharyya, Pushpak [3 ]
Affiliations
[1] Indian Inst Technol Patna, Dept Comp Sci & Engn, AI NLP ML Lab, Patna, Bihar, India
[2] Accenture, Accenture Technol Labs, Bangalore, Karnataka, India
[3] Indian Inst Technol, Dept Comp Sci & Technol, Mumbai, Maharashtra, India
Source
PLOS ONE | 2023, Vol. 18, Issue 02
DOI
10.1371/journal.pone.0269856
Chinese Library Classification
O [Mathematical Sciences and Chemistry]; P [Astronomy and Earth Sciences]; Q [Biological Sciences]; N [General Natural Sciences];
Subject classification codes
07 ; 0710 ; 09 ;
Abstract
Effective dialogue generation for task completion is challenging. The task requires the response generation system to generate responses consistent with the intent and slot values, to produce diverse responses, and to handle multiple domains. The response must also be contextually relevant to the previous utterances in the conversation. In this paper, we build six different models with Bi-directional Long Short Term Memory (Bi-LSTM) and Bidirectional Encoder Representations from Transformers (BERT) based encoders. To generate the correct slot values, we implement a copy mechanism on the decoder side. To capture the conversation context and the current state of the conversation, we introduce a simple heuristic for building a conversational knowledge graph. Using this novel algorithm, we are able to capture important aspects of a conversation. This conversational knowledge graph is then used by our response generation model to generate more relevant and consistent responses. With this knowledge graph, we do not need the entire utterance history; only the last utterance is needed to capture the conversational context. We conduct experiments showing the effectiveness of the knowledge graph in capturing the context and generating good responses. We compare these results against hierarchical encoder-decoder models and show that using triples from the conversational knowledge graph is an effective method for capturing the context and the user requirement. With this knowledge graph we obtain an average gain of 0.75 BLEU score across the different models. Similar results also hold across different manual evaluation metrics.
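The core idea in the abstract — keeping a compact conversational knowledge graph of triples so the generator needs only the latest utterance plus the graph, rather than the full utterance history — can be illustrated with a minimal sketch. The update rule below (the newest value per slot overwrites the old one) and all names are our assumptions for illustration, not the paper's exact heuristic.

```python
# Minimal sketch: maintain a conversational knowledge graph as
# (subject, relation, object) triples that summarize the dialogue state.
# Keying on (subject, relation) means a new slot value overwrites the
# old one, so the graph always reflects the current user requirement.

class ConversationalKG:
    def __init__(self):
        # Map (subject, relation) -> object, so each slot holds one value.
        self._triples = {}

    def update(self, subject, relation, obj):
        # Overwrite any earlier value for this slot.
        self._triples[(subject, relation)] = obj

    def triples(self):
        # Flatten back into (subject, relation, object) triples for the
        # response generation model to consume.
        return [(s, r, o) for (s, r), o in self._triples.items()]

# Usage: slot annotations from a few turns of a restaurant-booking dialogue.
kg = ConversationalKG()
kg.update("user", "wants_cuisine", "italian")
kg.update("user", "wants_area", "centre")
kg.update("user", "wants_cuisine", "chinese")  # user changes their mind

print(sorted(kg.triples()))
# [('user', 'wants_area', 'centre'), ('user', 'wants_cuisine', 'chinese')]
```

Because stale slot values are overwritten rather than accumulated, the graph stays small and the generator can condition on it together with only the last utterance.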
Pages: 18
Related papers
50 items
  • [32] Multi-Scale Convolutional Neural Network for Temporal Knowledge Graph Completion
    Liu, Wei
    Wang, Peijie
    Zhang, Zhihui
    Liu, Qiong
    COGNITIVE COMPUTATION, 2023, 15 (03) : 1016 - 1022
  • [33] Progressive Distillation Based on Masked Generation Feature Method for Knowledge Graph Completion
    Fan, Cunhang
    Chen, Yujie
    Xue, Jun
    Kong, Yonghui
    Tao, Jianhua
    Lv, Zhao
    THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 8, 2024, : 8380 - 8388
  • [34] A Comparative Study of Knowledge Graph-to-Text Generation Architectures in the Context of Conversational Agents
    Ghanem, Hussam
    Cruz, Christophe
    COMPLEX NETWORKS & THEIR APPLICATIONS XII, VOL 1, COMPLEX NETWORKS 2023, 2024, 1141 : 413 - 426
  • [35] ParamE: Regarding Neural Network Parameters as Relation Embeddings for Knowledge Graph Completion
    Che, Feihu
    Zhang, Dawei
    Tao, Jianhua
    Niu, Mingyue
    Zhao, Bocheng
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 2774 - 2781
  • [36] Research on Knowledge Graph Completion Based upon Knowledge Graph Embedding
    Feng, Tuoyu
    Wu, Yongsheng
    Li, Libing
    2024 9TH INTERNATIONAL CONFERENCE ON COMPUTER AND COMMUNICATION SYSTEMS, ICCCS 2024, 2024, : 1335 - 1342
  • [37] Double-Branch Multi-Attention based Graph Neural Network for Knowledge Graph Completion
    Xu, Hongcai
    Bao, Junpeng
    Liu, Wenbo
    PROCEEDINGS OF THE 61ST ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2023): LONG PAPERS, VOL 1, 2023, : 15257 - 15271
  • [38] PRGNN: Modeling high-order proximity with relational graph neural network for knowledge graph completion
    Zhu, Danhao
    NEUROCOMPUTING, 2024, 594
  • [39] Rethinking Graph Convolutional Networks in Knowledge Graph Completion
    Zhang, Zhanqiu
    Wang, Jie
    Ye, Jieping
    Wu, Feng
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 798 - 807
  • [40] Explicit Knowledge Graph Reasoning for Conversational Recommendation
    Ren, Xuhui
    Chen, Tong
    Nguyen, Quoc Viet Hung
    Cui, Lizhen
    Huang, Zi
    Yin, Hongzhi
    ACM TRANSACTIONS ON INTELLIGENT SYSTEMS AND TECHNOLOGY, 2024, 15 (04)