Deep Knowledge Tracing Model with an Evolved Transformer Structure

Cited: 0
Authors
Li, Zhijun [1 ]
Xue, Zixiao [1 ]
Liu, Chen [1 ]
Feng, Yanzhang [1 ]
Affiliations
[1] North China Univ Technol, Sch Elect & Control Engn, Beijing 100144, Peoples R China
Keywords
Deep knowledge tracing; Transformer; Hybrid attention mechanism; Interpretability;
DOI
10.1109/DDCLS58216.2023.10167354
CLC Classification Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Deep Learning based Knowledge Tracing (DKT) has become a research hotspot in the intelligent education field. Compared with conventional methods, DKT achieves better predictive performance, but it also has some problems, such as poor interpretability and difficulty in reflecting the causal association between the learning process and test results. In this paper, a new DKT model is proposed based on an evolved Transformer structure (DKT-ETS). The encoder layer is composed of three coding networks with a multi-head self-attention mechanism, whose inputs are three types of pre-processed data: process characteristic data, test label data, and answer results data, and whose outputs are the three matrices V, Q, and K. The decoder layer also uses the attention mechanism: its input is the three matrices from the encoder, and its output is the predicted result. By improving the structure, the new model introduces a certain interpretability into the V, Q, and K matrices of the attention mechanism. Thus, the causal relationship between the learning process and test results can be reflected to a certain extent: the V matrix represents the characteristic information of the testee's learning process; the Q matrix reflects the knowledge point information examined by the current test item; and the K matrix represents the results of the previous tests. DKT-ETS was validated on the large-scale knowledge tracing dataset EdNet, and the results show that its ACC and AUC evaluation indicators are significantly improved.
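The data flow described in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the three "coding networks" are stood in for by single linear projections (the paper uses multi-head self-attention encoders), and all shapes, weights, and the sigmoid readout are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
seq_len, d_model = 10, 16

# Three pre-processed input streams (random stand-ins for real features):
process_feats  = rng.normal(size=(seq_len, d_model))  # learning-process characteristics -> V
test_labels    = rng.normal(size=(seq_len, d_model))  # knowledge points of current items -> Q
answer_results = rng.normal(size=(seq_len, d_model))  # previous test results -> K

# One "coding network" per stream (a single linear layer here, as a
# stand-in for the paper's self-attention encoder networks):
W_v, W_q, W_k = (rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(3))
V = process_feats @ W_v
Q = test_labels @ W_q
K = answer_results @ W_k

# Decoder: scaled dot-product attention over the three encoder outputs,
# followed by a sigmoid readout giving one correctness probability per step.
attn = softmax(Q @ K.T / np.sqrt(d_model)) @ V
w_out = rng.normal(size=(d_model,)) * 0.1
pred = 1.0 / (1.0 + np.exp(-(attn @ w_out)))
print(pred.shape)
```

The point of the structure is that each attention matrix carries a nameable meaning (V: process features, Q: examined knowledge points, K: past results), which is where the claimed interpretability comes from.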
Pages: 1586-1592
Page count: 7
Related Papers
50 records in total
  • [41] A Genetic Causal Explainer for Deep Knowledge Tracing
    Li, Qing
    Yuan, Xin
    Liu, Sannyuya
    Gao, Lu
    Wei, Tianyu
    Shen, Xiaoxuan
    Sun, Jianwen
    IEEE TRANSACTIONS ON EVOLUTIONARY COMPUTATION, 2024, 28 (04) : 861 - 875
  • [42] Deep Knowledge Tracing on Skills with Small Datasets
    Tato, Ange
    Nkambou, Roger
    INTELLIGENT TUTORING SYSTEMS, ITS 2022, 2022, 13284 : 123 - 135
  • [43] Prerequisite-Driven Deep Knowledge Tracing
    Chen, Penghe
    Lu, Yu
    Zheng, Vincent W.
    Pian, Yang
    2018 IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM), 2018, : 39 - 48
  • [44] Interpreting Deep Learning Models for Knowledge Tracing
    Lu, Yu
    Wang, Deliang
    Chen, Penghe
    Meng, Qinggang
    Yu, Shengquan
    INTERNATIONAL JOURNAL OF ARTIFICIAL INTELLIGENCE IN EDUCATION, 2023, 33 (03) : 519 - 542
  • [45] Variational Deep Knowledge Tracing for Language Learning
    Ruan, Sherry
    Wei, Wei
    Landay, James
    LAK21 CONFERENCE PROCEEDINGS: THE ELEVENTH INTERNATIONAL CONFERENCE ON LEARNING ANALYTICS & KNOWLEDGE, 2021, : 323 - 332
  • [46] An Enhanced Deep Knowledge Tracing Model via Multiband Attention and Quantized Question Embedding
    Xu, Jiazhen
    Hu, Wanting
    APPLIED SCIENCES-BASEL, 2024, 14 (08):
  • [47] Knowledge ontology enhanced model for explainable knowledge tracing
    Wang, Yao
    Huo, Yujia
    Yang, Changxiao
    Huang, Xingchen
    Xia, Dawen
    Feng, Fujian
    JOURNAL OF KING SAUD UNIVERSITY-COMPUTER AND INFORMATION SCIENCES, 2024, 36 (05)
  • [48] Prior Knowledge on the Dynamics of Skill Acquisition Improves Deep Knowledge Tracing
    Pan, Qiushi
    Tezuka, Taro
    29TH INTERNATIONAL CONFERENCE ON COMPUTERS IN EDUCATION (ICCE 2021), VOL I, 2021, : 201 - 211
  • [49] GELT: A graph embeddings based lite-transformer for knowledge tracing
    Liang, Zhijie
    Wu, Ruixia
    Liang, Zhao
    Yang, Juan
    Wang, Ling
    Su, Jianyu
    PLOS ONE, 2024, 19 (05):
  • [50] Review and Performance Comparison of Deep Knowledge Tracing Models
    Wang Y.
    Zhu M.-X.
    Yang S.-H.
    Lu X.-S.
    Zhou A.-Y.
    Ruan Jian Xue Bao/Journal of Software, 2023, 34 (03): : 1365 - 1395