Deep Knowledge Tracing Based on Spatial and Temporal Representation Learning for Learning Performance Prediction

Cited: 12
Authors
Lyu, Liting [1 ]
Wang, Zhifeng [2 ]
Yun, Haihong [1 ]
Yang, Zexue [1 ]
Li, Ya [1 ]
Institutions
[1] Heilongjiang Inst Technol, Sch Comp Sci & Technol, Harbin 150050, Peoples R China
[2] Cent China Normal Univ, Sch Educ Informat Technol, Wuhan 430079, Peoples R China
Source
APPLIED SCIENCES-BASEL | 2022, Vol. 12, Issue 14
Funding
National Natural Science Foundation of China;
Keywords
prediction; learning performance; e-learning; deep learning; knowledge tracing; knowledge representation; spatial feature; temporal feature; convolutional neural network; bidirectional long short-term memory; MODEL;
DOI
10.3390/app12147188
Chinese Library Classification
O6 [Chemistry];
Subject Classification Code
0703;
Abstract
Knowledge tracing (KT) is a core component of intelligent education systems. Most current KT models either rely on expert judgments or exploit only a single network structure, which limits the full expression of learning features. To mine the features of students' learning process more adequately, this paper proposes Deep Knowledge Tracing Based on Spatial and Temporal Deep Representation Learning for Learning Performance Prediction (DKT-STDRL). DKT-STDRL first extracts spatial features from students' learning history sequences and then extracts temporal features to obtain deeper hidden information. Specifically, the model uses a CNN to extract spatial feature information from students' exercise sequences; these spatial features are then concatenated with the original exercise features to form joint learning features, which are fed into a BiLSTM component. Finally, the BiLSTM extracts temporal features from the joint learning features to predict whether a student will answer correctly at the next time step. Experiments on the public education datasets ASSISTment2009, ASSISTment2015, Synthetic-5, ASSISTchall, and Statics2011 show that DKT-STDRL achieves better prediction performance than DKT and CKT.
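The pipeline described in the abstract (a CNN extracting local "spatial" patterns from the exercise sequence, concatenated with the original interaction features, then a BiLSTM predicting next-step correctness) can be sketched in PyTorch. All layer sizes, the kernel width, and the interaction encoding below are illustrative assumptions, not the paper's reported hyperparameters.

```python
import torch
import torch.nn as nn

class DKTSTDRL(nn.Module):
    """Sketch of the DKT-STDRL idea: CNN (spatial) + BiLSTM (temporal)."""
    def __init__(self, num_skills, embed_dim=64, cnn_channels=32, hidden_dim=64):
        super().__init__()
        # Each (skill, correctness) interaction gets an id in [0, 2*num_skills)
        self.embed = nn.Embedding(2 * num_skills, embed_dim)
        # 1-D convolution over the time axis extracts local spatial patterns
        self.cnn = nn.Conv1d(embed_dim, cnn_channels, kernel_size=3, padding=1)
        # BiLSTM consumes [original embedding ; CNN features] as joint features
        self.bilstm = nn.LSTM(embed_dim + cnn_channels, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, num_skills)

    def forward(self, x):                          # x: (batch, seq) interaction ids
        e = self.embed(x)                          # (batch, seq, embed_dim)
        s = torch.relu(self.cnn(e.transpose(1, 2))).transpose(1, 2)
        joint = torch.cat([e, s], dim=-1)          # joint learning features
        h, _ = self.bilstm(joint)                  # (batch, seq, 2*hidden_dim)
        return torch.sigmoid(self.out(h))          # P(correct) per skill, per step

model = DKTSTDRL(num_skills=100)
probs = model(torch.randint(0, 200, (8, 20)))      # 8 students, 20 time steps
print(probs.shape)                                 # torch.Size([8, 20, 100])
```

The last time step's output row gives the predicted probability of a correct answer on each skill at the next interaction, which is how DKT-style models are typically read out.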
Pages: 21
Related Papers (50 records in total)
  • [2] He, Liangliang; Tang, Jintao; Li, Xiao; Wang, Pancheng; Chen, Feng; Wang, Ting. Multi-type factors representation learning for deep learning-based knowledge tracing. World Wide Web, 2022, 25(3): 1343-1372.
  • [4] Song, Xiangyu; Li, Jianxin; Cai, Taotao; Yang, Shuiqiao; Yang, Tingting; Liu, Chengfei. A survey on deep learning based knowledge tracing. Knowledge-Based Systems, 2022, 258.
  • [5] Liu, Tieyuan; Chen, Wei; Chang, Liang; Gu, Tianlong. Research Advances in the Knowledge Tracing Based on Deep Learning. Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2022, 59(1): 81-104.
  • [6] Liao, Ziyi; Liu, Minghui; Du, Bowen; Zhou, Haijun; Li, Linchao. A temporal and spatial prediction method for urban pipeline network based on deep learning. Physica A: Statistical Mechanics and its Applications, 2022, 608.
  • [7] Yang, Shanghui; Liu, Xin; Su, Hang; Zhu, Mengxia; Lu, Xuesong. Deep Knowledge Tracing with Learning Curves. 2022 IEEE International Conference on Data Mining Workshops (ICDMW), 2022: 282-291.
  • [8] Su, Hang; Liu, Xin; Yang, Shanghui; Lu, Xuesong. Deep knowledge tracing with learning curves. Frontiers in Psychology, 2023, 14.
  • [9] Liu, Lingbo; Zhen, Jiajie; Li, Guanbin; Zhan, Geng; He, Zhaocheng; Du, Bowen; Lin, Liang. Dynamic Spatial-Temporal Representation Learning for Traffic Flow Prediction. IEEE Transactions on Intelligent Transportation Systems, 2021, 22(11): 7169-7183.
  • [10] Sun, Yuzhuo; Xiong, Fei; Sun, Yongke; Zhao, Youjie; Cao, Yong. A miRNA Target Prediction Model Based on Distributed Representation Learning and Deep Learning. Computational and Mathematical Methods in Medicine, 2022, 2022.