Enhancing Paraphrase Question Generation With Prior Knowledge

Cited by: 0
Authors
Xie, Jiayuan [1]
Fang, Wenhao [1]
Huang, Qingbao [4]
Cai, Yi [1,2,3]
Wang, Tao [5]
Affiliations
[1] South China Univ Technol, Sch Software Engn, Guangzhou 510006, Peoples R China
[2] South China Univ Technol, Key Lab Big Data & Intelligent Robot, Minist Educ, Guangzhou 510006, Peoples R China
[3] Pengcheng Lab, Shenzhen 322099, Peoples R China
[4] Guangxi Univ, Sch Elect Engn, Nanning 530004, Peoples R China
[5] Kings Coll London, Dept Biostat & Hlth Informat, London, England
Funding
National Natural Science Foundation of China;
Keywords
Task analysis; Decoding; Speech processing; Logic gates; Knowledge based systems; Deep learning; Internet; Paraphrase question generation; prior knowledge; discriminator module;
DOI
10.1109/TASLP.2022.3221045
CLC number
O42 [Acoustics];
Subject classification codes
070206; 082403;
Abstract
Paraphrase question generation (PQG) aims to rewrite a given original question into a new paraphrase question that expresses the same meaning as the original but differs in form. Existing PQG methods mainly rely on synonym substitution or word-order adjustment of the original question. However, such word-level rewriting cannot guarantee a sufficient difference between the paraphrase and the original question. In this paper, we propose a knowledge-aware paraphrase question generation model. Our model first employs a knowledge extractor to retrieve prior knowledge related to the original question from a knowledge base. An attention mechanism and a gate mechanism are then introduced to selectively utilize the extracted prior knowledge during rewriting, which helps expand the content of the generated question and maximize the difference. Additionally, we use a discriminator module to encourage the generated paraphrase to be semantically close to both the original question and the ground truth. Specifically, the discriminator's loss function penalizes excessive distance between the representations of the paraphrase question and the ground truth. Extensive experiments on the Quora dataset show that the proposed model outperforms the baselines. We further apply our model to the SQuAD dataset, demonstrating its generalization ability on an existing QA dataset.
Pages: 1464-1475
Page count: 12
Related papers
50 records in total
  • [1] Generating Paraphrase with Topic as Prior Knowledge
    Liu, Yuanxin
    Lin, Zheng
    Liu, Fenglin
    Dai, Qinyun
    Wang, Weiping
    PROCEEDINGS OF THE 28TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT (CIKM '19), 2019: 2381-2384
  • [2] Paraphrase Generation Based on Lexical Knowledge and Features for a Natural Language Question Answering System
    Oh, Kyo-Joong
    Choi, Ho-Jin
    Gweon, Gahgene
    Heo, Jeong
    Ryu, Pum-Mo
    2015 INTERNATIONAL CONFERENCE ON BIG DATA AND SMART COMPUTING (BIGCOMP), 2015: 35-38
  • [3] A Question Generation Method Based on Subgraph Paraphrase
    Wen, Li-Qiang
    Xiong, Guan-Ming
    Wang, Yu
    Chen, Yi-Pu
    Li, Wei-Ping
    Zhao, Wen
    Tien Tzu Hsueh Pao/Acta Electronica Sinica, 2024, 52 (10): 3578-3588
  • [4] A Paraphrase Generation System for EHR Question Answering
    Soni, Sarvesh
    Roberts, Kirk
    SIGBIOMED WORKSHOP ON BIOMEDICAL NATURAL LANGUAGE PROCESSING (BIONLP 2019), 2019: 20-29
  • [5] Integrating Linguistic Knowledge to Sentence Paraphrase Generation
    Lin, Zibo
    Li, Ziran
    Ding, Ning
    Zheng, Hai-Tao
    Shen, Ying
    Wang, Wei
    Zhao, Cong-Zhi
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34: 8368-8375
  • [6] Linguistic steganography with knowledge-poor paraphrase generation
    Kermanidis, Katia Lida
    LITERARY AND LINGUISTIC COMPUTING, 2011, 26 (04): 417-434
  • [7] Dejavu: Enhancing Videoconferencing with Prior Knowledge
    Hu, Pan
    Misra, Rakesh
    Katti, Sachin
    HOTMOBILE '19 - PROCEEDINGS OF THE 20TH INTERNATIONAL WORKSHOP ON MOBILE COMPUTING SYSTEMS AND APPLICATIONS, 2019: 63-68
  • [8] Recognition of prior learning: exploring the 'knowledge question'
    Cooper, Linda
    Harris, Judy
    INTERNATIONAL JOURNAL OF LIFELONG EDUCATION, 2013, 32 (04): 447-463
  • [9] Enhancing SPARQL Query Generation for Knowledge Base Question Answering Systems by Learning to Correct Triplets
    Qi, Jiexing
    Su, Chang
    Guo, Zhixin
    Wu, Lyuwen
    Shen, Zanwei
    Fu, Luoyi
    Wang, Xinbing
    Zhou, Chenghu
    APPLIED SCIENCES-BASEL, 2024, 14 (04)
  • [10] Enhancing vision data using prior knowledge
    Redford, AH
    Khalili, K
    FLEXIBLE AUTOMATION AND INTELLIGENT MANUFACTURING 1996, 1996: 197-205