Machine Reading Comprehension with Rich Knowledge

Cited by: 2
Authors
He, Jun [1 ,2 ]
Peng, Li [1 ]
Zhang, Yinghui [2 ]
Sun, Bo [1 ,2 ]
Xiao, Rong [1 ,3 ]
Xiao, Yongkang [1 ,3 ]
Affiliations
[1] Beijing Normal Univ, Sch Artificial Intelligence, 19 Xinjiekouwai Rd, Beijing 100875, Peoples R China
[2] Beijing Normal Univ, Coll Educ Future, 18 Jinfeng Rd, Zhuhai 519087, Peoples R China
[3] Beijing Normal Univ, Intelligent Comp & Res Ctr, 19 Xinjiekouwai Rd, Beijing 100875, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Deep learning; machine reading comprehension; structured knowledge; unstructured knowledge;
DOI
10.1142/S0218001422510041
CLC number
TP18 [Artificial Intelligence Theory];
Discipline codes
081104; 0812; 0835; 1405;
Abstract
Machine reading comprehension (MRC) is a crucial and challenging task in natural language processing (NLP). With the development of deep learning, language models have achieved excellent results, yet they still struggle to answer complex questions. Researchers often utilize structured knowledge, such as knowledge bases (KBs), as external knowledge, directly extracting triples to enhance machine reading. Although triples can supply certain background knowledge, they are limited to the interrelationships among entities or words. Unstructured knowledge resources, such as Wikipedia, are by contrast rich and extensive, but these methods ignore them. In addition, the effect of combining the two types of knowledge remains unknown. In this study, we make a first attempt to explore the usefulness of combining them. We introduce a fusion mechanism into a rich knowledge fusion (RKF) layer to obtain more useful and relevant knowledge from different external knowledge resources. Furthermore, to promote interaction among the different types of knowledge, a bi-matching layer is added. We propose the RKF-NET framework based on BERT, and our experimental results demonstrate its effectiveness on two classic datasets: SQuAD1.1 and ARC (Easy and Challenge sets).
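The abstract describes fusing structured (KB-triple) and unstructured (Wikipedia-text) knowledge into the context representation, but gives no equations. As a minimal illustrative sketch only (the function names, attention-then-gate scheme, and NumPy setting are assumptions, not the authors' RKF layer), a fusion of two external knowledge sources over context token vectors might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_knowledge(context, kb_vecs, wiki_vecs):
    """Hypothetical knowledge fusion: attend context tokens over each
    external source, then combine the two summaries with a scalar gate.

    context:   (T, d) contextual token embeddings
    kb_vecs:   (K, d) embeddings of retrieved KB triples
    wiki_vecs: (W, d) embeddings of retrieved Wikipedia sentences
    returns:   (T, d) knowledge-enriched token representations
    """
    # Dot-product attention: each token summarizes each knowledge source.
    kb_summary = softmax(context @ kb_vecs.T) @ kb_vecs      # (T, d)
    wiki_summary = softmax(context @ wiki_vecs.T) @ wiki_vecs  # (T, d)
    # Per-token sigmoid gate decides the structured/unstructured mix.
    score = (context * (kb_summary + wiki_summary)).sum(-1, keepdims=True)
    gate = 1.0 / (1.0 + np.exp(-score))                      # (T, 1)
    return gate * kb_summary + (1.0 - gate) * wiki_summary
```

In a trained model the attention and gate would carry learned projection weights; the sketch omits them to keep the shape of the computation visible.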
Pages: 21