Mask and Cloze: Automatic Open Cloze Question Generation Using a Masked Language Model

Cited by: 2
Authors
Matsumori, Shoya [1 ]
Okuoka, Kohei [1 ]
Shibata, Ryoichi [1 ]
Inoue, Minami [1 ]
Fukuchi, Yosuke [2 ]
Imai, Michita [1 ]
Affiliations
[1] Keio Univ, Yokohama 2238522, Japan
[2] Natl Inst Informat, Tokyo 1018430, Japan
Keywords
Measurement; Testing; Computational modeling; Brain modeling; Training; Text categorization; Question answering (information retrieval); Open cloze test; Automatic question generation; Masked language model; Field study; Comprehension
DOI
10.1109/ACCESS.2023.3239005
CLC number
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
This paper presents the first attempt to apply a masked language model and the Gini coefficient to the field of English-language learning. We propose CLOZER, an algorithm that generates open cloze questions (OCQs) to assess the knowledge of English learners. OCQs have attracted attention both for measuring learners' ability and for facilitating their learning. However, because an OCQ is answered in free form, teachers must ensure that only the ground-truth answer, and no other word, is acceptable in the blank. A key benefit of CLOZER is that it relieves teachers of the burden of producing OCQs; it also provides a self-study environment for English learners by generating OCQs automatically. We evaluated CLOZER through quantitative experiments on 1,600 answers and statistically demonstrated its effectiveness. A comparison with human-generated questions further revealed that CLOZER generates OCQs better than the average non-native English teacher does. Additionally, we conducted a field study at a high school to clarify the benefits and hurdles of introducing CLOZER, and, on the basis of our findings, we propose several design improvements.
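The abstract describes selecting blank positions with a masked language model so that only the ground-truth word fits the blank. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the Hugging Face transformers fill-mask pipeline with bert-base-uncased, and the Gini-coefficient scoring of the prediction distribution is an illustrative reading of the abstract rather than CLOZER's exact criterion.

# Minimal sketch (not the authors' code): choosing blank positions for an open
# cloze question with a masked language model. Assumes Hugging Face
# "transformers" (fill-mask pipeline, bert-base-uncased); the Gini-based
# selection below is an illustrative assumption, not CLOZER's exact method.
import numpy as np
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-uncased")
MASK = fill_mask.tokenizer.mask_token  # "[MASK]" for BERT-style models

def gini(probs):
    # Gini coefficient of a probability vector: ~0 for a uniform distribution,
    # close to 1 when almost all mass sits on a single token.
    p = np.sort(np.asarray(probs, dtype=float))
    n = p.size
    index = np.arange(1, n + 1)
    return 2.0 * np.sum(index * p) / (n * p.sum()) - (n + 1.0) / n

def score_blanks(sentence, top_k=50):
    # Mask each word in turn; keep positions where the original word is the
    # model's top guess (so the intended answer fits) and the prediction mass
    # is concentrated (so few alternative fillers would also be accepted).
    words = sentence.split()
    candidates = []
    for i, word in enumerate(words):
        masked = " ".join(words[:i] + [MASK] + words[i + 1:])
        preds = fill_mask(masked, top_k=top_k)
        top_token = preds[0]["token_str"].strip().lower()
        if top_token == word.lower().strip(".,!?"):
            concentration = gini([p["score"] for p in preds])
            candidates.append((i, word, concentration))
    # Higher concentration -> fewer plausible alternatives for the blank.
    return sorted(candidates, key=lambda c: c[2], reverse=True)

if __name__ == "__main__":
    sentence = "She has been studying English since she was a child."
    for idx, word, c in score_blanks(sentence):
        print(f"candidate blank #{idx}: '{word}' (concentration={c:.3f})")

Under this reading, positions whose prediction distribution is highly concentrated on the original word are the ones least likely to accept an unintended alternative answer, which is the property an open cloze question needs.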
Pages: 9835-9850
Number of pages: 16
Related papers (50 in total)
  • [31] Automatic item generation in various STEM subjects using large language model prompting
    Park, Joonhyeong (joonhyeong.park@nie.edu.sg), 2025, 8
  • [32] Detection for Cultural Difference in Impression Using Masked Language Model
    Pituxcoosuvarn, Mondheera
    Murakami, Yohei
    Miwa, Kaede
    CULTURE AND COMPUTING, C&C 2023, 2023, 14035 : 569 - 579
  • [33] Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order
    Liao, Yi
    Jiang, Xin
    Liu, Qun
    58TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2020), 2020, : 263 - 274
  • [35] Retrieval-Augmented Generation Approach: Document Question Answering using Large Language Model
    Muludi, Kurnia
    Fitria, Kaira Milani
    Triloka, Joko
    Sutedi
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2024, 15 (03) : 776 - 785
  • [36] Using natural language generation in automatic route description
    Dale, R
    Geldof, S
    Prost, JP
    JOURNAL OF RESEARCH AND PRACTICE IN INFORMATION TECHNOLOGY, 2005, 37 (01): : 89 - 105
  • [37] Automatic Generation of Software Tools Using a Language Grammar
    Parker, Glenn A.
    IEEE SOUTHEASTCON 2018, 2018,
  • [38] QDG: A unified model for automatic question-distractor pairs generation
    Shuai, Pengju
    Li, Li
    Liu, Sishun
    Shen, Jun
    APPLIED INTELLIGENCE, 2023, 53 (07) : 8275 - 8285
  • [40] Objective Type Question Generation using Natural Language Processing
    Deena, G.
    Raja, K.
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (02) : 539 - 548