Contextual Embeddings and Graph Convolutional Networks for Concept Prerequisite Learning

Cited by: 0
Authors
Layoun, Jean-Charles [1 ]
Zouaq, Amal [1 ]
Desmarais, Michel [1 ]
Affiliations
[1] Polytech Montreal, Montreal, PQ, Canada
Keywords
Concept Prerequisite Relation; Large Language Models; Sentence Transformers; Graph Convolutional Networks;
DOI
10.1145/3605098.3636062
Chinese Library Classification
TP39 [Computer Applications];
Discipline codes
081203; 0835;
Abstract
Concept prerequisite learning (CPL) plays a crucial role in education. The objective of CPL is to predict prerequisite relations between different concepts. In this paper, we present a new approach for CPL using Sentence Transformers and Relational Graph Convolutional Networks (R-GCNs). This approach creates concept embeddings from single-sentence definitions extracted from Wikipedia using a Sentence Transformer. These embeddings are then used as an input feature matrix for the R-GCN, in addition to a graph structure that distinguishes prerequisites and non-prerequisites as distinct link types. Furthermore, the R-GCN is optimized simultaneously on CPL and concept domain classification to enhance prerequisite prediction generalization for unseen domains. Extensive experiments on the AL-CPL dataset show the effectiveness of our approach for the in-domain and cross-domain settings, as it outperforms the State-Of-The-Art (SOTA) methods on this dataset. Finally, we introduce a novel data split algorithm for this task to address a methodological issue found in previous studies. The new data split algorithm makes CPL more challenging to solve, but also more realistic as it excludes simple inferences by transitivity.
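The architecture described in the abstract, i.e. Sentence Transformer embeddings used as the input feature matrix of an R-GCN that treats prerequisite and non-prerequisite links as distinct relation types, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the dimensions, the toy graph, and the function name `rgcn_layer` are assumptions, and the input matrix `H` stands in for the Wikipedia-definition embeddings.

```python
import numpy as np

def rgcn_layer(H, edges_by_relation, W_rel, W_self):
    """One R-GCN layer: h_i' = ReLU(W_self h_i + sum_r sum_{j in N_r(i)} W_r h_j / c_{i,r})."""
    n, _ = H.shape
    out = H @ W_self.T                        # self-loop transformation
    for r, edges in enumerate(edges_by_relation):
        msg = np.zeros_like(out)
        deg = np.zeros(n)
        for src, dst in edges:                # message from src to dst under relation r
            msg[dst] += H[src] @ W_rel[r].T
            deg[dst] += 1
        deg[deg == 0] = 1                     # isolated nodes: avoid division by zero
        out += msg / deg[:, None]             # mean-normalized aggregation per relation
    return np.maximum(out, 0.0)               # ReLU

rng = np.random.default_rng(0)
d_in, d_out, n_concepts = 8, 4, 5
H = rng.normal(size=(n_concepts, d_in))       # stand-in for Sentence Transformer embeddings
W_rel = rng.normal(size=(2, d_out, d_in))     # one weight matrix per relation type:
                                              # prerequisite vs. non-prerequisite links
W_self = rng.normal(size=(d_out, d_in))

prereq_edges = [(0, 1), (1, 2)]               # e.g. concept 0 is a prerequisite of concept 1
non_prereq_edges = [(3, 4)]
H1 = rgcn_layer(H, [prereq_edges, non_prereq_edges], W_rel, W_self)
print(H1.shape)  # (5, 4)
```

In the paper's setting, the resulting node representations would feed two heads, one for prerequisite prediction and one for concept domain classification, trained jointly.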
Pages: 81-90
Page count: 10
Related papers
50 records in total
  • [1] Heterogeneous Graph Neural Networks for Concept Prerequisite Relation Learning in Educational Data
    Jia, Chenghao
    Shen, Yongliang
    Tang, Yechun
    Sun, Lu
    Lu, Weiming
    [J]. 2021 CONFERENCE OF THE NORTH AMERICAN CHAPTER OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS: HUMAN LANGUAGE TECHNOLOGIES (NAACL-HLT 2021), 2021, : 2036 - 2047
  • [2] Learning with Dual-graph for Concept Prerequisite Discovering
    Xu, Guolan
    Bai, Rujiang
    [J]. Data Analysis and Knowledge Discovery, 2024, 8 (05) : 38 - 45
  • [3] Contrastive Graph Learning with Graph Convolutional Networks
    Nagendar, G.
    Sitaram, Ramachandrula
    [J]. DOCUMENT ANALYSIS SYSTEMS, DAS 2022, 2022, 13237 : 96 - 110
  • [4] Continual Pre-Training of Language Models for Concept Prerequisite Learning with Graph Neural Networks
    Tang, Xin
    Liu, Kunjia
    Xu, Hao
    Xiao, Weidong
    Tan, Zhen
    [J]. MATHEMATICS, 2023, 11 (12)
  • [5] Towards Learning Generalizable Code Embeddings Using Task-agnostic Graph Convolutional Networks
    Ding, Zishuo
    Li, Heng
    Shang, Weiyi
    Chen, Tse-Hsun
    [J]. ACM TRANSACTIONS ON SOFTWARE ENGINEERING AND METHODOLOGY, 2023, 32 (02)
  • [6] Learning Connectivity with Graph Convolutional Networks
    Sahbi, Hichem
    [J]. 2020 25TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION (ICPR), 2021, : 9996 - 10003
  • [7] LEARNING CONVOLUTIONAL NEURAL NETWORKS WITH DEEP PART EMBEDDINGS
    Gupta, Nitin
    Mujumdar, Shashank
    Agarwal, Prerna
    Jain, Abhinav
    Mehta, Sameep
    [J]. 2019 IEEE INTERNATIONAL CONFERENCE ON ACOUSTICS, SPEECH AND SIGNAL PROCESSING (ICASSP), 2019, : 2037 - 2041
  • [8] Learning graph structure via graph convolutional networks
    Zhang, Qi
    Chang, Jianlong
    Meng, Gaofeng
    Xu, Shibiao
    Xiang, Shiming
    Pan, Chunhong
    [J]. PATTERN RECOGNITION, 2019, 95 : 308 - 318
  • [9] Knowledge graph embeddings for dealing with concept drift in machine learning
    Chen, Jiaoyan
    Lecue, Freddy
    Pan, Jeff Z.
    Deng, Shumin
    Chen, Huajun
    [J]. JOURNAL OF WEB SEMANTICS, 2021, 67
  • [10] Graph Convolutional Embeddings for Recommender Systems
    Duran, Paula G.
    Karatzoglou, Alexandros
    Vitria, Jordi
    Xin, Xin
    Arapakis, Ioannis
    [J]. IEEE ACCESS, 2021, 9 : 100173 - 100184