共 50 条
- [1] Research on machine reading comprehension based on pre-trained model [J]. International Journal of Reasoning-based Intelligent Systems, 2022, 14 (04): : 240 - 246
- [2] From Cloze to Comprehension: Retrofitting Pre-trained Masked Language Models to Pre-trained Machine Reader [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023,
- [3] Exploiting Diverse Information in Pre-Trained Language Model for Multi-Choice Machine Reading Comprehension [J]. APPLIED SCIENCES-BASEL, 2022, 12 (06):
- [4] HORNET: Enriching Pre-trained Language Representations with Heterogeneous Knowledge Sources [J]. PROCEEDINGS OF THE 30TH ACM INTERNATIONAL CONFERENCE ON INFORMATION & KNOWLEDGE MANAGEMENT, CIKM 2021, 2021, : 2608 - 2617
- [5] Pre-trained Language Model Representations for Language Generation [J]. 2019 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL HLT 2019), VOL. 1, 2019, : 4052 - 4059
- [6] Graph-combined Coreference Resolution Methods on Conversational Machine Reading Comprehension with Pre-trained Language Model [J]. PROCEEDINGS OF THE SECOND DIALDOC WORKSHOP ON DOCUMENT-GROUNDED DIALOGUE AND CONVERSATIONAL QUESTION ANSWERING (DIALDOC 2022), 2022, : 72 - 82
- [7] On the Language Neutrality of Pre-trained Multilingual Representations [J]. FINDINGS OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, EMNLP 2020, 2020, : 1663 - 1674
- [9] Evaluating the Summarization Comprehension of Pre-Trained Language Models [J]. Lobachevskii Journal of Mathematics, 2023, 44 : 3028 - 3039