Learning Transferable Features for Open-Domain Question Answering

Cited by: 0
Authors
Zuin, Gianlucca [1 ]
Chaimowicz, Luiz [1 ]
Veloso, Adriano [1 ]
Affiliations
[1] Univ Fed Minas Gerais, Dept Comp Sci, Belo Horizonte, MG, Brazil
Funding
EU Horizon 2020;
Keywords
Question-Answering; Transfer Learning; Deep Networks;
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104; 0812; 0835; 1405;
Abstract
Corpora used to learn open-domain Question-Answering (QA) models are typically collected from a wide variety of topics or domains. Since QA requires understanding natural language, open-domain QA models generally need very large training corpora. A simple way to alleviate this data demand is to restrict the domain covered by the QA model, thus leading to domain-specific QA models. While learning improved QA models for a specific domain remains challenging due to the lack of sufficient training data on the topic of interest, additional training data can be obtained from related topic domains. Thus, instead of learning a single open-domain QA model, we investigate domain-adaptation approaches in order to create multiple improved domain-specific QA models. We demonstrate that this can be achieved by stratifying the source dataset, without the need to search for complementary data, unlike many other domain-adaptation approaches. We propose a deep architecture that jointly exploits convolutional and recurrent networks for learning domain-specific features while transferring domain-shared features. That is, we use transferable features to enable model adaptation from multiple source domains. We consider different transference approaches designed to learn span-level and sentence-level QA models. We find that domain adaptation greatly improves sentence-level QA performance, and that span-level QA benefits from sentence information. Finally, we show that a simple clustering algorithm may be employed when the topic domains are unknown, and that the resulting loss in accuracy is negligible.
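The abstract's final point, that questions can be grouped into pseudo-domains by a simple clustering algorithm when topic labels are unavailable, can be illustrated with a minimal sketch: k-means over bag-of-words vectors. This is not the paper's implementation; the helper names, the toy questions, and the deterministic farthest-point initialisation are all illustrative assumptions.

```python
import math

def bow_vector(text, vocab):
    # Bag-of-words count vector over a fixed vocabulary (illustrative only).
    words = text.lower().split()
    return [words.count(w) for w in vocab]

def kmeans(vectors, k, iters=10):
    # Deterministic farthest-point initialisation: start from the first
    # vector, then repeatedly add the vector farthest from all centroids.
    centroids = [list(vectors[0])]
    while len(centroids) < k:
        nxt = max(vectors, key=lambda v: min(math.dist(v, c) for c in centroids))
        centroids.append(list(nxt))
    assign = [0] * len(vectors)
    for _ in range(iters):
        # Assignment step: nearest centroid by Euclidean distance.
        for i, v in enumerate(vectors):
            assign[i] = min(range(k), key=lambda c: math.dist(v, centroids[c]))
        # Update step: each centroid becomes the mean of its members.
        for c in range(k):
            members = [v for v, a in zip(vectors, assign) if a == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign

# Toy corpus with two latent topic domains (sports and medicine).
questions = [
    "who won the football match",
    "which footballer scored the football goal",
    "which medicine treats the infection",
    "what medicine cures the disease",
]
vocab = sorted({w for q in questions for w in q.lower().split()})
vectors = [bow_vector(q, vocab) for q in questions]
labels = kmeans(vectors, k=2)
```

Each cluster label can then be treated as a pseudo-domain when stratifying the source dataset for domain-specific training; in practice one would use stronger text representations than raw word counts.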
Pages: 8