UnrealMentor GPT: A System for Teaching Programming Based on a Large Language Model

Citations: 0
Authors
Zhu, Hongli [1 ]
Xiang, Jian [2 ]
Yang, Zhichuang [2 ]
Affiliations
[1] Hangzhou City Univ, Sch Informat & Elect Engn, Hangzhou, Peoples R China
[2] Zhejiang Univ Sci & Technol, Sch Informat & Elect Engn, Hangzhou, Peoples R China
Keywords
Generative pre-training; large language model; programming teaching system
DOI
10.1002/cae.70023
Chinese Library Classification
TP39 [Computer Applications]
Discipline Classification Codes
081203; 0835
Abstract
This paper introduces UnrealMentor GPT, a multi-agent debugging framework that combines large language model (LLM) capabilities with a dynamically updated knowledge base. The framework has been deployed in programming courses for university students in computer-related majors. Built on Generative Pre-training (GPT) technology, the teaching system guides students through a hierarchical learning process using multiple specialized agents (syntax checking, algorithm analysis, and optimization) together with retrieval-augmented generation (RAG). Experimental results from undergraduate courses show that students spend less time debugging code, produce more accurate solutions, and learn more efficiently overall. Follow-up surveys on teaching effectiveness indicated that students were satisfied with the learning process, and feedback from teaching staff suggested that the system simplifies error correction and deepens students' conceptual understanding. The current study has several limitations, including a small sample size and a short intervention period, which restrict the settings in which the system has been validated. Future work will focus on expanding the participant pool, exploring cross-language applicability, and conducting longitudinal experiments to verify the effectiveness of UnrealMentor GPT in diverse educational environments.
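The abstract does not include implementation details, but the described pipeline (specialized agents plus retrieval-augmented generation over an updatable knowledge base) can be illustrated with a minimal sketch. All names below (SyntaxAgent, AlgorithmAgent, OptimizationAgent, KnowledgeBase.retrieve, llm_call) are hypothetical placeholders chosen for illustration, not components of the actual UnrealMentor GPT system.

```python
"""Minimal sketch of a multi-agent tutoring pipeline with RAG.

Hypothetical illustration of the architecture described in the abstract;
not the paper's actual implementation.
"""

from dataclasses import dataclass, field


def llm_call(prompt: str) -> str:
    """Stand-in for a real LLM backend; here it only acknowledges the prompt."""
    return f"[LLM feedback for a prompt of {len(prompt)} characters]"


@dataclass
class KnowledgeBase:
    """Dynamically updatable store of teaching notes, retrieved by keyword overlap."""
    documents: list[str] = field(default_factory=list)

    def add(self, doc: str) -> None:
        self.documents.append(doc)

    def retrieve(self, query: str, k: int = 2) -> list[str]:
        words = set(query.lower().split())
        ranked = sorted(self.documents,
                        key=lambda d: len(words & set(d.lower().split())),
                        reverse=True)
        return ranked[:k]


class Agent:
    """Base agent: builds a role-specific prompt augmented with retrieved context."""
    role = "generic review"

    def run(self, code: str, kb: KnowledgeBase) -> str:
        context = "\n".join(kb.retrieve(f"{self.role} {code}"))
        prompt = f"Role: {self.role}\nContext:\n{context}\nStudent code:\n{code}"
        return llm_call(prompt)


class SyntaxAgent(Agent):
    role = "syntax checking"


class AlgorithmAgent(Agent):
    role = "algorithm analysis"


class OptimizationAgent(Agent):
    role = "optimization suggestions"


def tutor(code: str, kb: KnowledgeBase) -> dict[str, str]:
    """Run the agents in a fixed hierarchical order and collect their feedback."""
    pipeline = [SyntaxAgent(), AlgorithmAgent(), OptimizationAgent()]
    return {agent.role: agent.run(code, kb) for agent in pipeline}


if __name__ == "__main__":
    kb = KnowledgeBase()
    kb.add("Common C++ syntax errors: missing semicolons, unmatched braces.")
    kb.add("Loop optimization: avoid repeated work inside nested loops.")
    student_code = "for(int i=0;i<n;i++){ sum += a[i] }"
    for role, feedback in tutor(student_code, kb).items():
        print(role, "->", feedback)
```

In practice the keyword-overlap retrieval would be replaced by embedding search and llm_call by an actual model endpoint; the sketch only shows how per-role agents and a shared knowledge base could be composed into one hierarchical feedback pass.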
Pages: 16