UnrealMentor GPT: A System for Teaching Programming Based on a Large Language Model

Cited: 0
Authors
Zhu, Hongli [1 ]
Xiang, Jian [2 ]
Yang, Zhichuang [2 ]
Affiliations
[1] Hangzhou City Univ, Sch Informat & Elect Engn, Hangzhou, Peoples R China
[2] Zhejiang Univ Sci & Technol, Sch Informat & Elect Engn, Hangzhou, Peoples R China
Keywords
Generative pre-training; large language model; programming teaching system
DOI
10.1002/cae.70023
Chinese Library Classification
TP39 [Computer Applications]
Discipline Classification Codes
081203; 0835
Abstract
This paper introduces UnrealMentor GPT, a multiagent debugging framework that combines large language model (LLM) capabilities with a dynamically updated knowledge base. The framework has been deployed in programming courses for university students in computer-related majors. Built on Generative Pre-training (GPT) technology, the teaching system guides students through a hierarchical learning process using multiple specialized agents (syntax checking, algorithm analysis, and optimization) together with retrieval-augmented generation (RAG). Experiments in undergraduate courses show that students spend less time debugging code, produce more accurate solutions, and learn noticeably more efficiently. Follow-up surveys on teaching effectiveness indicate that students were satisfied with the learning process, and feedback from teaching staff suggests that the system simplifies error correction and deepens students' conceptual understanding. The current study has limitations, including a small sample size and a short intervention period, which restrict the scenarios in which the system has been validated. Future work will expand the participant groups, explore cross-language applicability, and conduct longitudinal experiments to verify the effectiveness of UnrealMentor GPT in a wider range of educational environments.
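The abstract describes the system's architecture (role-specific agents plus retrieval over a dynamically updated knowledge base) but this record contains no code. The sketch below is a hypothetical illustration, not the authors' implementation: the names Agent, retrieve, and debug_session are invented for this example, the LLM is abstracted as a plain callable, and retrieval is reduced to word overlap as a stand-in for a real RAG pipeline.

```python
# Hypothetical sketch of a multiagent debugging pipeline with toy retrieval.
# Names and structure are illustrative only; they are not taken from the paper.
from dataclasses import dataclass
from typing import Callable, List

# The LLM is abstracted as a callable: prompt in, completion out.
LLM = Callable[[str], str]


@dataclass
class Agent:
    name: str          # e.g. "syntax", "algorithm", "optimization"
    instruction: str   # role-specific instruction prepended to the prompt

    def review(self, code: str, context: List[str], llm: LLM) -> str:
        # Build a role-specific prompt augmented with retrieved course material.
        prompt = (
            f"You are a {self.name} tutor for a programming course.\n"
            f"{self.instruction}\n\n"
            "Relevant course material:\n" + "\n".join(context) +
            "\n\nStudent code:\n" + code
        )
        return llm(prompt)


def retrieve(knowledge_base: List[str], code: str, top_k: int = 2) -> List[str]:
    """Toy retrieval: rank knowledge-base entries by word overlap with the code."""
    words = set(code.lower().split())
    ranked = sorted(knowledge_base,
                    key=lambda doc: len(words & set(doc.lower().split())),
                    reverse=True)
    return ranked[:top_k]


def debug_session(code: str, agents: List[Agent],
                  knowledge_base: List[str], llm: LLM) -> List[str]:
    """Run each specialized agent in turn and collect its feedback."""
    context = retrieve(knowledge_base, code)
    return [f"[{agent.name}] {agent.review(code, context, llm)}" for agent in agents]


if __name__ == "__main__":
    # A fake LLM so the sketch runs without any model or API key.
    fake_llm: LLM = lambda prompt: "feedback based on: " + prompt.splitlines()[0]

    agents = [
        Agent("syntax", "Point out syntax errors and how to fix them."),
        Agent("algorithm", "Explain logical or algorithmic mistakes."),
        Agent("optimization", "Suggest clearer or more efficient code."),
    ]
    kb = ["for loops iterate over sequences", "functions are defined with def"]
    for line in debug_session("for i in range(10) print(i)", agents, kb, fake_llm):
        print(line)
```

Running the example with the built-in fake LLM prints one feedback line per agent, mirroring the hierarchical, role-per-agent feedback loop the abstract describes.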
Pages: 16