Evolution of Reading Comprehension and Question Answering Systems

Cited by: 3
Authors
Krishnamoorthy, Venkatesh [1]
Institution
[1] Missouri University of Science and Technology, Rolla, MO 65409, USA
Keywords
Knowledge Graph; BERT; ELMo; Attention; LSTM; Transformer; RNN; SQuAD; Multi-Hop; Haystack; Transfer Learning
DOI
10.1016/j.procs.2021.05.024
Chinese Library Classification (CLC) number
TP18 [Artificial Intelligence Theory]
Discipline classification codes
081104; 0812; 0835; 1405
Abstract
Natural Language Processing (NLP) has witnessed considerable advances in textual understanding, first through statistical and rule-based techniques and more recently through neural networks and deep learning. This paper surveys research on Reading Comprehension and Question Answering (QA) implementations. The initial focus is on attention and Transformer models: a brief description of the architectures is presented, highlighting the essence of the 'Attention Is All You Need' paper, whose authors elucidated the Transformer's significant departure from the recurrence concept of Recurrent Neural Networks (RNNs) and thereby laid the groundwork for Bidirectional Encoder Representations from Transformers (BERT). Subsequently, trends in Open-Domain Question Answering (ODQA), which mark the progression beyond passage-based question answering, are presented. Of particular interest is Haystack, an end-to-end open-source framework for question answering and neural search; this field appears to be a promising avenue toward a more intelligent form of 'search'. In a nutshell, the paper weaves through RNNs, Long Short-Term Memory (LSTM) networks, and the currently dominant attention-based Transformer models in NLP. Finally, we dwell on more contemporary research such as ODQA, multi-hop QA, and evaluation using adversarial networks.
(c) 2021 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0). Peer review under responsibility of the scientific committee of the Complex Adaptive Systems Conference, June 2021.
Pages: 231-238
Number of pages: 8
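The following sketch is not taken from the surveyed paper; it is a minimal, plain-NumPy illustration of the scaled dot-product attention that the abstract credits (via 'Attention Is All You Need') with replacing recurrence: every position attends to every other position in a single matrix operation, rather than stepping through the sequence as an RNN or LSTM would. All names and dimensions are illustrative.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention as described in 'Attention Is All You Need'.

    Q, K have shape (seq_len, d_k); V has shape (seq_len, d_v). Every position
    attends to every other position in one step, with no recurrence over time.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # row-wise softmax
    return weights @ V, weights

# Toy run: 4 tokens with 8-dimensional query/key/value projections.
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
output, attention = scaled_dot_product_attention(Q, K, V)
print(output.shape, attention.shape)   # (4, 8) (4, 4)
```

Likewise, the ODQA and Haystack discussion in the abstract centres on extractive, SQuAD-style readers. The snippet below is a hedged illustration using the Hugging Face transformers question-answering pipeline rather than Haystack itself (whose API differs between major versions); the checkpoint name is merely one example of a model fine-tuned on SQuAD, not something prescribed by the paper.

```python
from transformers import pipeline

# Extractive (SQuAD-style) question answering over a single passage.
# The checkpoint below is one example of a SQuAD-fine-tuned model.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

result = qa(
    question="What does the Transformer dispense with?",
    context=(
        "The Transformer dispenses with recurrence and convolutions entirely, "
        "relying instead on attention mechanisms to draw global dependencies."
    ),
)
print(result["answer"], round(result["score"], 3))
```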