Distractor Generation Through Text-to-Text Transformer Models

Cited by: 1
Authors
de-Fitero-Dominguez, David [1 ]
Garcia-Lopez, Eva [1 ]
Garcia-Cabot, Antonio [1 ]
del-Hoyo-Gabaldon, Jesus-Angel [1 ]
Moreno-Cediel, Antonio [1 ]
Affiliations
[1] Univ Alcala, Dept Ciencias Comp, Edificio Politecn, Alcala De Henares 28871, Madrid, Spain
Keywords
Artificial intelligence; natural languages; natural language processing; computer applications; educational technology
DOI
10.1109/ACCESS.2024.3361673
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
In recent years, transformer language models have made a significant impact on automatic text generation. This study focuses on the task of distractor generation in Spanish using a fine-tuned multilingual text-to-text model, namely mT5. Our method outperformed established baselines based on LSTM networks, confirming the effectiveness of Transformer architectures in such NLP tasks. While comparisons with other Transformer-based solutions yielded diverse outcomes depending on the metric of choice, our method notably achieved superior results on the ROUGE metric compared to the GPT-2 approach. Although traditional evaluation metrics such as BLEU and ROUGE are commonly used, this paper argues for more context-sensitive metrics, given the inherent variability in what counts as an acceptable distractor. Among the contributions of this research are a comprehensive comparison with other methods, an examination of the potential drawbacks of multilingual models, and the introduction of alternative evaluation metrics. Future research directions, derived from our findings and a review of related work, are also suggested, with particular emphasis on leveraging other language models and Transformer architectures.
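A minimal sketch of the kind of pipeline the abstract describes: an mT5 checkpoint generating Spanish distractor candidates, scored against a reference with ROUGE. The checkpoint name, prompt format, sampling settings, and reference distractor below are illustrative assumptions, not the authors' configuration; in the paper, mT5 is fine-tuned on a distractor-generation dataset before inference.

# Sketch: generate Spanish distractor candidates with mT5 and score them
# with ROUGE-L. Assumes `transformers`, `sentencepiece`, and `rouge-score`
# are installed. The base (un-fine-tuned) checkpoint stands in for the
# paper's fine-tuned model.
from transformers import MT5ForConditionalGeneration, MT5Tokenizer
from rouge_score import rouge_scorer

MODEL_NAME = "google/mt5-base"  # assumption; the paper fine-tunes an mT5 variant
tokenizer = MT5Tokenizer.from_pretrained(MODEL_NAME)
model = MT5ForConditionalGeneration.from_pretrained(MODEL_NAME)

# Hypothetical input format: question + correct answer + supporting context.
source = (
    "pregunta: ¿Cuál es la capital de Francia? "
    "respuesta: París "
    "contexto: París es la capital y la ciudad más poblada de Francia."
)

inputs = tokenizer(source, return_tensors="pt", truncation=True, max_length=512)
outputs = model.generate(
    **inputs,
    max_new_tokens=16,
    num_return_sequences=3,  # several candidate distractors per question
    do_sample=True,          # nucleus sampling; beam search is equally plausible
    top_p=0.95,
)
candidates = [tokenizer.decode(o, skip_special_tokens=True) for o in outputs]

# ROUGE-L against a single reference distractor, the style of n-gram overlap
# metric the abstract reports and argues is too strict for this task (many
# distinct distractors can be acceptable).
scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=False)
reference = "Lyon"  # illustrative reference distractor
for cand in candidates:
    f1 = scorer.score(reference, cand)["rougeL"].fmeasure
    print(f"{cand!r}: ROUGE-L F1 = {f1:.3f}")

Because a single reference undercounts equally valid alternatives, this one-reference ROUGE setup also illustrates why the paper advocates more context-sensitive evaluation.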
Pages: 25580-25589
Number of Pages: 10
Related Papers
50 items (first 10 shown)
  • [1] Assessing the Stability of Text-to-Text Models for Keyword Generation Tasks
    Walkowiak, Tomasz
    COMPUTATIONAL SCIENCE, ICCS 2024, PT III, 2024, 14834 : 112 - 119
  • [2] ViT5: Pretrained Text-to-Text Transformer for Vietnamese Language Generation
    Phan, Long
    Tran, Hieu
    Nguyen, Hieu
    Trinh, Trieu H.
    NAACL 2022: THE 2022 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES: PROCEEDINGS OF THE STUDENT RESEARCH WORKSHOP, 2022, : 136 - 142
  • [3] Homograph Disambiguation with Text-to-Text Transfer Transformer
    Rezackova, Marketa
    Tihelka, Daniel
    Matousek, Jindrich
    INTERSPEECH 2024, 2024, : 2785 - 2789
  • [4] End-to-End generation of Multiple-Choice questions using Text-to-Text transfer Transformer models
    Rodriguez-Torrealba, Ricardo
    Garcia-Lopez, Eva
    Garcia-Cabot, Antonio
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 208
  • [5] Text-To-Text Generation for Issue Report Classification
    Rejithkumar, Gokul
    Anish, Preethu Rose
    Ghaisas, Smita
    PROCEEDINGS 2024 ACM/IEEE INTERNATIONAL WORKSHOP ON NL-BASED SOFTWARE ENGINEERING, NLBSE 2024, 2024, : 53 - 56
  • [6] RST Discourse Parsing as Text-to-Text Generation
    Hu, Xinyu
    Wan, Xiaojun
    IEEE-ACM TRANSACTIONS ON AUDIO SPEECH AND LANGUAGE PROCESSING, 2023, 31 : 3278 - 3289
  • [7] Argument Mining as a Text-to-Text Generation Task
    Kawarada, Masayuki
    Hirao, Tsutomu
    Uchida, Wataru
    Nagata, Masaaki
    PROCEEDINGS OF THE 18TH CONFERENCE OF THE EUROPEAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS, VOL 1: LONG PAPERS, 2024, : 2002 - 2014
  • [8] Deep copycat Networks for Text-to-Text Generation
    Ive, Julia
    Madhyastha, Pranava
    Specia, Lucia
    2019 CONFERENCE ON EMPIRICAL METHODS IN NATURAL LANGUAGE PROCESSING AND THE 9TH INTERNATIONAL JOINT CONFERENCE ON NATURAL LANGUAGE PROCESSING (EMNLP-IJCNLP 2019): PROCEEDINGS OF THE CONFERENCE, 2019, : 3227 - 3236
  • [9] TFix: Learning to Fix Coding Errors with a Text-to-Text Transformer
    Berabi, Berkay
    He, Jingxuan
    Raychev, Veselin
    Vechev, Martin
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [10] Exploring the limits of transfer learning with a unified text-to-text transformer
    Raffel, Colin
    Shazeer, Noam
    Roberts, Adam
    Lee, Katherine
    Narang, Sharan
    Matena, Michael
    Zhou, Yanqi
    Li, Wei
    Liu, Peter J.
    JOURNAL OF MACHINE LEARNING RESEARCH, 2020, 21