Math Word Problem Solver Based on Text-to-Text Transformer Model

Times Cited: 0
Authors
Yang, Chuanzhi [1 ]
Huang, Runze [1 ]
Yu, Xinguo [2 ]
Peng, Rao [2 ]
Affiliations
[1] Wuhan Univ Technol, Sch Comp Sci & Technol, Wuhan, Peoples R China
[2] Cent China Normal Univ, Natl Engn Res Ctr E Learning, Wuhan, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
math word problem; problem solver; text-to-text transformer model; deep learning
DOI
10.1109/TALE52509.2021.9678686
Chinese Library Classification (CLC)
TP31 [Computer Software]
Discipline Codes
081202; 0835
Abstract
In recent years, automatically solving math word problems has attracted increasing attention, and algorithms that solve such problems the way humans do are a key technology for advancing digital education. This paper proposes a deep learning model based on a text-to-text transformer to solve math word problems. The model treats each math word problem as a "text-to-text" task: it takes a text as input and produces a new text as output, where the output text takes the form of a mathematical expression. In our experiments, the model is evaluated on Ape210K, a dataset of 210K Chinese primary-school-level math word problems. The experimental results show that the model solves 78.61% of the math word problems overall. In addition, the problems are classified into six categories according to the knowledge points involved, and the model's performance on each category of problem is analyzed.
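The text-to-text formulation described in the abstract maps naturally onto a standard encoder-decoder pipeline. The following is a minimal sketch, not the authors' implementation: it assumes the Hugging Face transformers library and the public multilingual checkpoint "google/mt5-small" (the paper does not specify its backbone, checkpoint, or preprocessing), and it illustrates how a problem text would be turned into an expression string after fine-tuning on Ape210K-style (problem, expression) pairs.

# Minimal sketch of the "text in, expression out" formulation.
# Assumed stack: Hugging Face transformers + an mT5 checkpoint; this is
# not the paper's exact setup. A model fine-tuned on Ape210K-style
# (problem, expression) pairs is required for meaningful output.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

MODEL_NAME = "google/mt5-small"  # hypothetical choice of backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def solve(problem_text: str) -> str:
    """Generate a candidate math expression for one word problem."""
    inputs = tokenizer(problem_text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example: a Chinese elementary-school problem, as in Ape210K.
expression = solve("小明有3个苹果，又买了5个，他一共有多少个苹果？")
print(expression)  # a fine-tuned model would be expected to emit "3+5"

Under this formulation, answer accuracy (such as the 78.61% reported above) would be computed by evaluating each generated expression and comparing the result with the reference answer, so a prediction can count as correct even when its expression differs from the reference expression but yields the same value.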
Pages: 818-822 (5 pages)