Knowledge structure enhanced graph representation learning model for attentive knowledge tracing

Cited by: 25
Authors
Gan, Wenbin [1 ]
Sun, Yuan [1 ]
Sun, Yi [2 ]
Affiliations
[1] Sokendai, Natl Inst Informat, Tokyo, Japan
[2] Univ Chinese Acad Sci, Sch Comp Sci & Technol, Beijing, Peoples R China
Funding
Japan Society for the Promotion of Science (JSPS);
Keywords
cognitive question difficulty; graph representation learning; intelligent tutoring systems; knowledge structure discovery; knowledge tracing; learner proficiency estimation;
DOI
10.1002/int.22763
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Knowledge tracing (KT) is a fundamental personalized-tutoring technique for learners in online learning systems. Recent KT methods employ flexible deep neural network-based models that excel at this task. However, the adequacy of KT is still challenged by the sparseness of learners' exercise data. To alleviate the sparseness problem, most existing KT studies are performed at the skill level rather than the question level, as questions are often numerous and associated with far fewer skills. At the skill level, however, KT neglects the distinctive information related to the questions themselves and their relations. Consequently, the models can imprecisely infer learners' knowledge states and might fail to capture the long-term dependencies in the exercising sequences. In the knowledge domain, skills are naturally linked as a graph (with the edges being the prerequisite relations between pedagogical concepts). We refer to such a graph as a knowledge structure (KS). Incorporating a KS into the KT procedure can potentially resolve both the sparseness and the information loss, but this avenue has been underexplored because obtaining the complete KS of a domain is challenging and labor-intensive. In this paper, we propose a novel KS-enhanced graph representation learning model for KT with an attention mechanism (KSGKT). We first explore eight methods that automatically infer the domain KS from learner response data and integrate it into the KT procedure. Leveraging a graph representation learning model, we then obtain the question and skill embeddings from the KS-enhanced graph. To incorporate more distinctive information on the questions, we extract the cognitive difficulty of each question from each learner's learning history. We then propose a convolutional representation method that fuses these distinctive features, thus obtaining a comprehensive representation of each question. These representations are input to the proposed KT model, and the long-term dependencies are handled by the attention mechanism. The model finally predicts the learner's performance on new problems. Extensive experiments conducted from six perspectives on three real-world data sets demonstrate the superiority and interpretability of our model for learner-performance modeling. Based on the KT results, we also suggest three potential applications of our model.
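To make the knowledge-structure discovery step above concrete, the sketch below infers a directed prerequisite graph over skills from learner response logs. This is a minimal heuristic based on the asymmetry of mastery orderings, not one of the eight inference methods evaluated in the paper; the function name `infer_skill_graph`, the `(skill_id, correct)` log format, and the `threshold` parameter are illustrative assumptions.

```python
import numpy as np

def infer_skill_graph(sequences, n_skills, threshold=0.1):
    """Infer a directed prerequisite graph over skills from response logs.

    sequences: one list per learner of (skill_id, correct) pairs in
    chronological order. Heuristic (an assumption, not the paper's
    method): skill i is treated as a prerequisite of skill j when a
    learner's first correct answer on i tends to precede their first
    correct answer on j far more often than the reverse.
    """
    before = np.zeros((n_skills, n_skills))
    for seq in sequences:
        first_correct = {}
        for step, (skill, correct) in enumerate(seq):
            if correct and skill not in first_correct:
                first_correct[skill] = step
        mastered = list(first_correct)
        for i in mastered:
            for j in mastered:
                if i != j and first_correct[i] < first_correct[j]:
                    before[i, j] += 1  # i was mastered before j
    total = before + before.T
    # Ordering asymmetry in [-1, 1]; set to 0 for never-observed pairs.
    with np.errstate(divide="ignore", invalid="ignore"):
        asym = np.where(total > 0, (before - before.T) / total, 0.0)
    return (asym > threshold).astype(float)  # adjacency matrix

# Toy usage: two learners who master skill 0 before skills 1 and 2,
# yielding edges 0 -> 1 and 0 -> 2.
logs = [
    [(0, 1), (1, 0), (1, 1), (2, 1)],
    [(0, 1), (2, 1), (1, 1)],
]
print(infer_skill_graph(logs, n_skills=3))
```

In the full pipeline the paper describes, an inferred graph of this kind would then feed a graph representation learner that produces question and skill embeddings for the attention-based predictor; the heuristic here only stands in for that discovery stage.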
Pages: 2012-2045
Page count: 34
Related Papers
50 records
  • [1] Wang, Mengdan; Peng, Chao; Yang, Rui; Wang, Chenchao; Chen, Yao; Yu, Xiaohua. GASKT: A Graph-Based Attentive Knowledge-Search Model for Knowledge Tracing. KNOWLEDGE SCIENCE, ENGINEERING AND MANAGEMENT, PT I, 2021, 12815: 268-279.
  • [2] Wang, Xinping; Chen, Liangyu; Zhang, Min. Deep Attentive Model for Knowledge Tracing. THIRTY-SEVENTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 37 NO 8, 2023: 10192-10199.
  • [3] Li, Zhifei; Jian, Yue; Xue, Zengcan; Zheng, Yumin; Zhang, Miao; Zhang, Yan; Hou, Xiaoju; Wang, Xiaoguang. Text-enhanced knowledge graph representation learning with local structure. INFORMATION PROCESSING & MANAGEMENT, 2024, 61 (05).
  • [4] He, Peng; Zhou, Gang; Chen, Jing; Zhang, Mengli; Ning, Yuanlong. Type-Enhanced Temporal Knowledge Graph Representation Learning Model. Jisuanji Yanjiu yu Fazhan/Computer Research and Development, 2023, 60 (04): 916-929.
  • [5] Hu, Linmei; Zhang, Mengmei; Li, Shaohua; Shi, Jinghan; Shi, Chuan; Yang, Cheng; Liu, Zhiyuan. Text-Graph Enhanced Knowledge Graph Representation Learning. FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2021, 4.
  • [6] Wang, Yashen; Zhang, Huanhuan; Shi, Ge; Liu, Zhirun; Zhou, Qiang. A Model of Text-Enhanced Knowledge Graph Representation Learning With Mutual Attention. IEEE ACCESS, 2020, 8: 52895-52905.
  • [7] Wang, Yashen; Zhang, Huanhuan; Xie, Haiyong. A Model of Text-Enhanced Knowledge Graph Representation Learning with Collaborative Attention. ASIAN CONFERENCE ON MACHINE LEARNING, VOL 101, 2019, 101: 236-251.
  • [8] Zhang, Xuelong; Zhang, Juntao; Lin, Nanzhou; Yang, Xiandi. Sequential Self-Attentive Model for Knowledge Tracing. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING - ICANN 2021, PT I, 2021, 12891: 318-330.
  • [9] Zhang, Jinglin; Shen, Bo. A lightweight hierarchical graph convolutional model for knowledge graph representation learning. APPLIED INTELLIGENCE, 2024, 54 (21): 10695-10708.
  • [10] Han, Donghee; Kim, Daehee; Kim, Minsu; Han, Keejun; Yi, Mun Yong. Temporal enhanced inductive graph knowledge tracing. APPLIED INTELLIGENCE, 2023, 53: 29282-29299.