Incremental BERT with commonsense representations for multi-choice reading comprehension

Cited by: 0
Authors
Ronghan Li
Lifang Wang
Zejun Jiang
Dong Liu
Meng Zhao
Xinyu Lu
Affiliations
[1] Northwestern Polytechnical University, School of Computer Science and Engineering
Source
Multimedia Tools and Applications, 2021, 80(21-23)
Keywords
Machine reading comprehension; BERT; External knowledge; Common sense; Deep learning
DOI
Not available
Abstract
Compared to extractive machine reading comprehension (MRC), which is limited to text spans, multi-choice MRC is more flexible for evaluating a model's ability to use external commonsense knowledge. On the one hand, existing methods rely on transfer learning and complicated matching networks to solve multi-choice MRC, which lack interpretability on commonsense questions. On the other hand, although Transformer-based pre-trained language models such as BERT show strong performance on MRC, external knowledge such as unspoken commonsense and world knowledge still cannot be used explicitly in downstream tasks. In this work, we present three simple yet effective injection methods, plugged into BERT's structure, that fine-tune directly on multi-choice MRC tasks with off-the-shelf commonsense representations. Moreover, we introduce a mask mechanism for token-level multi-hop relationship searching to filter external knowledge. Experimental results indicate that the incremental BERT outperforms the baseline by a considerable margin on DREAM and CosmosQA, two knowledge-driven multi-choice datasets. Further analysis shows the robustness of the incremental model when the training set is incomplete.
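The abstract names two components: injecting off-the-shelf commonsense representations into BERT's hidden states, and a token-level multi-hop mask that filters which external knowledge reaches each token. The paper's exact formulation is not given on this page, so the following is a minimal illustrative sketch under assumed shapes: a sigmoid-gated fusion of contextual and knowledge vectors, and a mask computed by expanding seed tokens over a token-relation graph. Function names, the gate parameterization, and the graph convention are all assumptions, not the authors' architecture.

```python
import numpy as np

def gated_knowledge_injection(hidden, knowledge, W_g, b_g):
    """Fuse token-level hidden states with aligned knowledge vectors.

    hidden:    (seq_len, d) contextual token representations
    knowledge: (seq_len, d) commonsense embeddings aligned per token
               (zero vectors where no concept matches)
    W_g, b_g:  gate parameters of shapes (2*d, d) and (d,)
    """
    concat = np.concatenate([hidden, knowledge], axis=-1)  # (seq_len, 2d)
    gate = 1.0 / (1.0 + np.exp(-(concat @ W_g + b_g)))     # sigmoid gate
    # Per-dimension interpolation between context and knowledge.
    return gate * hidden + (1.0 - gate) * knowledge

def multihop_mask(adjacency, seed_mask, hops=2):
    """Keep knowledge only for tokens reachable from seed tokens
    within `hops` steps of an (undirected) token-relation graph.

    adjacency: (n, n) 0/1 token-relation matrix
    seed_mask: (n,) boolean, tokens with a direct concept match
    """
    reach = seed_mask.astype(bool).copy()
    for _ in range(hops):
        # One hop: a token is reached if any neighbor is reached.
        reach = reach | ((adjacency.astype(int) @ reach.astype(int)) > 0)
    return reach
```

With zero-initialized gate parameters the gate is 0.5 everywhere, so the fused output is the average of the two representations; training would then push the gate toward context or knowledge per token and dimension.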
Pages: 32311-32333
Page count: 22
Related papers
50 records in total
  • [1] Incremental BERT with commonsense representations for multi-choice reading comprehension
    Li, Ronghan
    Wang, Lifang
    Jiang, Zejun
    Liu, Dong
    Zhao, Meng
    Lu, Xinyu
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2021, 80 (21-23) : 32311 - 32333
  • [2] Option Attentive Capsule Network for Multi-choice Reading Comprehension
    Miao, Hang
    Liu, Ruifang
    Gao, Sheng
    [J]. NEURAL INFORMATION PROCESSING (ICONIP 2019), PT III, 2019, 11955 : 306 - 318
  • [4] Leveraging greater relations for improving multi-choice reading comprehension
    Yan, Hong
    Liu, Lijun
    Feng, Xupeng
    Huang, Qingsong
    [J]. NEURAL COMPUTING & APPLICATIONS, 2022, 34 (23): : 20851 - 20864
  • [5] A Co-Matching Model for Multi-choice Reading Comprehension
    Wang, Shuohang
    Yu, Mo
    Chang, Shiyu
    Jiang, Jing
    [J]. PROCEEDINGS OF THE 56TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 2, 2018, : 746 - 751
  • [6] Multi-Grained Evidence Inference for Multi-Choice Reading Comprehension
    Zhao, Yilin
    Zhao, Hai
    Duan, Sufeng
    [J]. IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 3896 - 3907
  • [7] A New Multi-choice Reading Comprehension Dataset for Curriculum Learning
    Liang, Yichan
    Li, Jianheng
    Yin, Jian
    [J]. ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101 : 742 - 757
  • [8] SSIN: Sentence Semantic Interaction Network for Multi-Choice Reading Comprehension
    Xu, Xiaobo
    Tohti, Turdi
    Hamdulla, Askar
    [J]. IEEE ACCESS, 2022, 10 : 113915 - 113922
  • [9] DIMN: Dual Integrated Matching Network for multi-choice reading comprehension
    Wei, Qiang
    Ma, Kun
    Liu, Xinyu
    Ji, Ke
    Yang, Bo
    Abraham, Ajith
    [J]. ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2024, 130
  • [10] MMM: Multi-Stage Multi-Task Learning for Multi-Choice Reading Comprehension
    Jin, Di
    Gao, Shuyang
    Kao, Jiun-Yu
    Chung, Tagyoung
    Hakkani-tur, Dilek
    [J]. THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 8010 - 8017