Deep Knowledge Tracing Model with an Evolved Transformer Structure

Cited: 0
Authors
Li, Zhijun [1 ]
Xue, Zixiao [1 ]
Liu, Chen [1 ]
Feng, Yanzhang [1 ]
Affiliations
[1] North China Univ Technol, Sch Elect & Control Engn, Beijing 100144, Peoples R China
Keywords
Deep knowledge tracing; Transformer; Hybrid attention mechanism; Interpretability;
DOI
10.1109/DDCLS58216.2023.10167354
CLC classification
TP [Automation technology; computer technology]
Discipline code
0812
Abstract
Deep-learning-based Knowledge Tracing (DKT) has become a research hotspot in the intelligent-education field. Compared with conventional methods, DKT achieves better predictive performance, but it also suffers from poor interpretability and difficulty in reflecting the causal association between the learning process and test results. In this paper, a new DKT model based on an evolved Transformer structure (DKT-ETS) is proposed. The encoder layer is composed of three coding networks with a multi-head self-attention mechanism, whose inputs are three types of pre-processed data: process characteristic data, test label data, and answer result data; their outputs form the three matrices V, Q, and K. The decoder layer also uses the attention mechanism: its input is the three matrices from the encoder, and its output is the predicted result. Through this structural improvement, the new model introduces a degree of interpretability into the V, Q, and K matrices of the attention mechanism, so the causal relationship between the learning process and test results can be reflected to a certain extent: the V matrix represents the characteristic information of the testee's learning process; the Q matrix reflects the knowledge-point information examined by the current test item; and the K matrix represents the results of previous tests. DKT-ETS was validated on the large-scale knowledge-tracing dataset EdNet, and the results show that its ACC and AUC evaluation metrics are significantly improved.
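The decoder attention described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it only shows standard scaled dot-product attention under the paper's stated role assignment, where Q encodes the knowledge points of current test items, K encodes previous test results, and V encodes learning-process features. All names, shapes, and the embedding dimension are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def decoder_attention(V, Q, K):
    """Scaled dot-product attention with the V/Q/K roles from the abstract.

    Q: knowledge-point embeddings of the current test items   (T x d)
    K: embeddings of the previous test results                (T x d)
    V: learning-process feature embeddings                    (T x d)
    (Hypothetical shapes; the paper's actual tensors may differ.)
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)       # relevance of each past result to each item
    weights = softmax(scores, axis=-1)  # attention distribution over past interactions
    return weights @ V                  # weighted summary of process features

# Toy example: 5 interactions, 8-dimensional embeddings.
rng = np.random.default_rng(0)
T, d = 5, 8
V, Q, K = rng.normal(size=(3, T, d))
out = decoder_attention(V, Q, K)
print(out.shape)  # (5, 8)
```

In this reading, each row of the output is a process-feature summary weighted by how strongly the current item's knowledge points match prior results, which is what lends the V, Q, and K matrices their interpretable roles.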
Pages: 1586-1592
Page count: 7
Related papers (50 in total)
  • [31] Deep Knowledge Tracing with Side Information
    Wang, Zhiwei
    Feng, Xiaoqin
    Tang, Jiliang
    Huang, Gale Yan
    Liu, Zitao
    ARTIFICIAL INTELLIGENCE IN EDUCATION, AIED 2019, PT II, 2019, 11626 : 303 - 308
  • [32] Towards more accurate and interpretable model: Fusing multiple knowledge relations into deep knowledge tracing
    Duan, Zhiyi
    Dong, Xiaoxiao
    Gu, Hengnian
    Wu, Xiong
    Li, Zhen
    Zhou, Dongdai
    EXPERT SYSTEMS WITH APPLICATIONS, 2024, 243
  • [33] ENHANCING DEEP KNOWLEDGE TRACING (DKT) MODEL BY INTRODUCING EXTRA STUDENT ATTRIBUTES
    Suragani, Girish
    Pothuraju, Lakshmi Narayana
    Reddi, Kamal Sandeep
    Godfrey, W. Wilfred
    2019 IEEE 5TH INTERNATIONAL CONFERENCE FOR CONVERGENCE IN TECHNOLOGY (I2CT), 2019,
  • [34] Deep Knowledge Tracing is an Implicit Dynamic Multidimensional Item Response Theory Model
    Vie, Jill-Jenn
    Kashima, Hisashi
    31ST INTERNATIONAL CONFERENCE ON COMPUTERS IN EDUCATION, ICCE 2023, VOL I, 2023, : 46 - 54
  • [35] A survey on deep learning based knowledge tracing
    Song, Xiangyu
    Li, Jianxin
    Cai, Taotao
    Yang, Shuiqiao
    Yang, Tingting
    Liu, Chengfei
    KNOWLEDGE-BASED SYSTEMS, 2022, 258
  • [36] Incorporating Rich Features into Deep Knowledge Tracing
    Zhang, Liang
    Xiong, Xiaolu
    Zhao, Siyuan
    Botelho, Anthony
    Heffernan, Neil T.
    PROCEEDINGS OF THE FOURTH (2017) ACM CONFERENCE ON LEARNING @ SCALE (L@S'17), 2017, : 169 - 172
  • [37] GameDKT: Deep knowledge tracing in educational games
    Hooshyar, Danial
    Huang, Yueh-Min
    Yang, Yeongwook
    EXPERT SYSTEMS WITH APPLICATIONS, 2022, 196
  • [38] Deep Performance Factors Analysis for Knowledge Tracing
    Pu, Shi
    Converse, Geoffrey
    Huang, Yuchi
    ARTIFICIAL INTELLIGENCE IN EDUCATION (AIED 2021), PT I, 2021, 12748 : 331 - 341
  • [39] Heterogeneous Features Integration in Deep Knowledge Tracing
    Cheung, Lap Pong
    Yang, Haiqin
    NEURAL INFORMATION PROCESSING (ICONIP 2017), PT II, 2017, 10635 : 653 - 662
  • [40] Interpreting Deep Learning Models for Knowledge Tracing
    Yu Lu
    Deliang Wang
    Penghe Chen
    Qinggang Meng
    Shengquan Yu
    International Journal of Artificial Intelligence in Education, 2023, 33 : 519 - 542