Solving ESL Sentence Completion Questions via Pre-trained Neural Language Models

Cited by: 2
Authors
Liu, Qiongqiong [1 ]
Liu, Tianqiao [1 ]
Zhao, Jiafu [1 ]
Fang, Qiang [1 ]
Ding, Wenbiao [1 ]
Wu, Zhongqin [1 ]
Xia, Feng [3 ]
Tang, Jiliang [2 ]
Liu, Zitao [1 ]
Affiliations
[1] TAL Educ Grp, Beijing, Peoples R China
[2] Michigan State Univ, Data Sci & Engn Lab, E Lansing, MI 48824 USA
[3] Federat Univ Australia, Ballarat, Vic, Australia
Funding
National Key Research and Development Program of China;
Keywords
Sentence completion; Pre-trained language model; Neural networks;
DOI
10.1007/978-3-030-78270-2_46
CLC Classification Number
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Sentence completion (SC) questions present a sentence with one or more blanks to be filled in, along with three to five candidate words or phrases as options. SC questions are widely used with students learning English as a Second Language (ESL), and building computational approaches that automatically solve such questions benefits language learners. In this work, we propose a neural framework that solves SC questions in English examinations by utilizing pre-trained language models. We conduct extensive experiments on a real-world K-12 ESL SC question dataset, and the results demonstrate the superiority of our model in terms of prediction accuracy. Furthermore, we run a precision-recall trade-off analysis to discuss the practical issues of deploying the model in real-life scenarios. To encourage reproducible results, we make our code publicly available at https://github.com/AIED2021/ESL-SentenceCompletion.
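The abstract only outlines the general idea of solving SC questions with pre-trained language models; the authors' actual framework is in the linked repository. As an illustration only, the following minimal sketch (not the authors' implementation) fills the blank with each option and picks the candidate whose completed sentence a pre-trained GPT-2 model scores as most probable. The choice of GPT-2, the `___` blank marker, and the helper names `sentence_loss` and `solve_sc` are assumptions made for this example.

```python
# Minimal sketch of option ranking with a pre-trained LM (illustrative only).
# Each option fills the blank; the completed sentence with the lowest average
# per-token loss (i.e., highest likelihood under GPT-2) is chosen.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sentence_loss(sentence):
    """Average negative log-likelihood of the sentence under GPT-2."""
    ids = tokenizer(sentence, return_tensors="pt").input_ids
    with torch.no_grad():
        out = model(ids, labels=ids)  # labels=ids gives mean cross-entropy
    return out.loss.item()

def solve_sc(stem, options):
    """Return the option whose completed sentence the LM finds most probable."""
    filled = [stem.replace("___", opt) for opt in options]
    best = min(range(len(options)), key=lambda i: sentence_loss(filled[i]))
    return options[best]

print(solve_sc("She ___ to school every day.", ["go", "goes", "going", "gone"]))
# expected output: goes
```

Note that comparing mean per-token loss across options of different lengths is a simplification; pseudo-log-likelihood scoring with a masked language model is a common alternative for this kind of ranking.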
Pages: 256-261
Page count: 6