Distractor Generation Through Text-to-Text Transformer Models

Cited by: 1
Authors
de-Fitero-Dominguez, David [1 ]
Garcia-Lopez, Eva [1 ]
Garcia-Cabot, Antonio [1 ]
del-Hoyo-Gabaldon, Jesus-Angel [1 ]
Moreno-Cediel, Antonio [1 ]
Affiliations
[1] Univ Alcala, Dept Ciencias Comp, Edificio Politecn, Alcala De Henares 28871, Madrid, Spain
Keywords
Artificial intelligence; natural languages; natural language processing; computer applications; educational technology; MULTIPLE; CORRECT
DOI
10.1109/ACCESS.2024.3361673
CLC (Chinese Library Classification) number
TP [Automation technology, computer technology]
Discipline code
0812
Abstract
In recent years, transformer language models have made a significant impact on automatic text generation. This study focuses on the task of distractor generation in Spanish using a fine-tuned multilingual text-to-text model, namely mT5. Our method outperformed established baselines based on LSTM networks, confirming the effectiveness of Transformer architectures in such NLP tasks. While comparisons with other Transformer-based solutions yielded diverse outcomes depending on the metric of choice, our method notably achieved superior results on the ROUGE metric compared with the GPT-2 approach. Although traditional evaluation metrics such as BLEU and ROUGE are commonly used, this paper argues for more context-sensitive metrics, given the inherent variability in acceptable distractor generation results. Among the contributions of this research are a comprehensive comparison with other methods, an examination of the potential drawbacks of multilingual models, and the introduction of alternative evaluation metrics. Future research directions, derived from our findings and a review of related works, are also suggested, with a particular emphasis on leveraging other language models and Transformer architectures.
Pages: 25580-25589
Page count: 10
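
Since the abstract describes fine-tuning the multilingual mT5 model for Spanish distractor generation, the following is a minimal sketch of how such a model might be invoked, assuming the Hugging Face transformers library; the checkpoint name and the pregunta/respuesta/contexto input format are illustrative assumptions, not the authors' published setup.

```python
# A minimal sketch, assuming the Hugging Face "transformers" library.
# "google/mt5-base" is the public pretrained checkpoint; in practice a
# checkpoint fine-tuned for distractor generation would be loaded here.
# The "pregunta/respuesta/contexto" input format is a hypothetical
# illustration, not the authors' published prompt scheme.
from transformers import MT5ForConditionalGeneration, MT5Tokenizer

model_name = "google/mt5-base"
tokenizer = MT5Tokenizer.from_pretrained(model_name)
model = MT5ForConditionalGeneration.from_pretrained(model_name)

# Text-to-text input: question, correct answer, and supporting context
# concatenated into a single source string.
source = (
    "pregunta: ¿Cuál es la capital de Francia? "
    "respuesta: París "
    "contexto: París es la capital y ciudad más poblada de Francia."
)

inputs = tokenizer(source, return_tensors="pt", truncation=True)
outputs = model.generate(
    **inputs,
    max_length=32,
    num_beams=5,
    num_return_sequences=3,  # three candidate distractors per question
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))
```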
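
Likewise, since the abstract weighs BLEU and ROUGE as evaluation metrics, here is a minimal sketch of scoring generated distractors against references, assuming the Hugging Face evaluate library; the prediction and reference strings are placeholders, not data from the paper.

```python
# A minimal sketch, assuming the Hugging Face "evaluate" library (which
# wraps the BLEU and "rouge_score" implementations). All strings below are
# illustrative placeholders, not the paper's evaluation data.
import evaluate

predictions = ["el río más largo de España", "una ciudad de Francia"]
references = [["el río más caudaloso de España"], ["una región de Francia"]]

bleu = evaluate.load("bleu")
rouge = evaluate.load("rouge")

# BLEU takes one list of reference strings per prediction; the packaged
# ROUGE metric accepts a single reference string per prediction.
print(bleu.compute(predictions=predictions, references=references))
print(rouge.compute(predictions=predictions,
                    references=[refs[0] for refs in references]))
```

In line with the abstract's argument, high n-gram overlap is not necessary for a distractor to be acceptable, so such scores should be read alongside more context-sensitive measures.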