Neural response generation for task completion using conversational knowledge graph

Cited by: 1
|
Authors
Ahmad, Zishan [1 ]
Ekbal, Asif [1 ]
Sengupta, Shubhashis [2 ]
Bhattacharyya, Pushpak [3 ]
Affiliations
[1] Indian Inst Technol Patna, Dept Comp Sci & Engn, AI NLP ML Lab, Patna, Bihar, India
[2] Accenture, Accenture Technol Labs, Bangalore, Karnataka, India
[3] Indian Inst Technol, Dept Comp Sci & Technol, Mumbai, Maharashtra, India
Source
PLOS ONE | 2023 / Vol. 18 / Iss. 02
DOI
10.1371/journal.pone.0269856
CLC classification
O [Mathematical sciences and chemistry]; P [Astronomy, Earth sciences]; Q [Biological sciences]; N [General natural sciences];
Discipline codes
07 ; 0710 ; 09 ;
Abstract
Building an effective dialogue generation system for task completion is challenging. The task requires the response generation system to produce responses that are consistent with the intent and slot values, diverse, able to handle multiple domains, and contextually relevant to the previous utterances in the conversation. In this paper, we build six different models with Bi-directional Long Short Term Memory (Bi-LSTM) and Bidirectional Encoder Representations from Transformers (BERT) based encoders. To generate the correct slot values, we implement a copy mechanism on the decoder side. To capture the conversation context and the current state of the conversation, we introduce a simple heuristic to build a conversational knowledge graph. Using this novel algorithm, we are able to capture the important aspects of a conversation. This conversational knowledge graph is then used by our response generation model to generate more relevant and consistent responses. With this knowledge graph, we do not need the entire utterance history; the last utterance alone suffices to capture the conversational context. We conduct experiments showing the effectiveness of the knowledge graph in capturing the context and generating good responses. We compare these results against hierarchical encoder-decoder models and show that using triples from the conversational knowledge graph is an effective method for capturing the context and the user's requirements. With this knowledge graph, we show an average gain of 0.75 BLEU score across the different models. Similar results also hold across different manual evaluation metrics.
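The abstract describes accumulating conversational state as knowledge-graph triples so that only the last utterance, rather than the full history, is needed as encoder input. A minimal sketch of that idea is shown below; the paper does not publish this exact code, so the class, triple format, and linearisation scheme here are illustrative assumptions:

```python
# Hypothetical sketch of the conversational knowledge-graph heuristic: store
# (intent, slot, value) triples turn by turn, letting newer slot values
# overwrite older ones, then pair the linearised graph with only the latest
# utterance to form the model's context input.
from typing import Dict, List, Tuple

Triple = Tuple[str, str, str]

class ConversationalKG:
    def __init__(self) -> None:
        self.triples: List[Triple] = []

    def update(self, intent: str, slots: Dict[str, str]) -> None:
        """Add one triple per slot; a repeated (intent, slot) pair replaces the old value."""
        for slot, value in slots.items():
            self.triples = [t for t in self.triples
                            if not (t[0] == intent and t[1] == slot)]
            self.triples.append((intent, slot, value))

    def context(self, last_utterance: str) -> str:
        """Linearise the graph and prepend it to the last utterance,
        standing in for the full dialogue history."""
        kg = " ; ".join(f"{h} {r} {t}" for h, r, t in self.triples)
        return f"[KG] {kg} [UTT] {last_utterance}"

kg = ConversationalKG()
kg.update("book_restaurant", {"cuisine": "italian", "area": "centre"})
kg.update("book_restaurant", {"area": "north"})  # user revises the area
print(kg.context("any cheap places?"))
# → [KG] book_restaurant cuisine italian ; book_restaurant area north [UTT] any cheap places?
```

Overwriting stale slot values keeps the graph a compact summary of the user's current requirements, which is what lets the model drop all but the last utterance.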
Pages: 18