Group-Level Cognitive Diagnosis: A Multi-Task Learning Perspective

Cited by: 8
Authors
Huang, Jie [1 ]
Liu, Qi [1 ]
Wang, Fei [1 ]
Huang, Zhenya [1 ]
Fang, Songtao [1 ]
Wu, Runze [2 ]
Chen, Enhong [1 ]
Su, Yu [1 ,3 ]
Wang, Shijin [3 ]
Affiliations
[1] Univ Sci & Technol China, Anhui Prov Key Lab Big Data Anal & Applicat Sch C, Hefei, Anhui, Peoples R China
[2] NetEase Inc, Fuxi AI Lab, Hangzhou, Peoples R China
[3] IFLYTEK Res, Hefei, Anhui, Peoples R China
Source
2021 21ST IEEE INTERNATIONAL CONFERENCE ON DATA MINING (ICDM 2021) | 2021
Funding
National Natural Science Foundation of China
Keywords
Group-Level Cognitive Diagnosis; Multi-Task Learning; Attention Mechanism; Data Sparsity; DINA Model
DOI
10.1109/ICDM51629.2021.00031
CLC Number: TP18 [Artificial Intelligence Theory]
Discipline Codes: 081104; 0812; 0835; 1405
Abstract
Most cognitive diagnosis research in education has concentrated on individual assessment, aiming to discover the latent characteristics of students. In many real-world scenarios, however, group-level assessment is an important and meaningful task; for example, assessing classes across regions can reveal differences in teaching quality between contexts. In this work, we consider assessing the cognitive ability of a group of students, which aims to mine a group's proficiency on specific knowledge concepts. The main challenge in this task is the sparsity of group-exercise response data, which seriously degrades assessment performance. Existing works either fail to make effective use of additional student-exercise response data or fail to reasonably model the relationship between group ability and individual ability in different learning contexts, yielding sub-optimal diagnosis results. To this end, we propose a general Multi-Task based Group-Level Cognitive Diagnosis (MGCD) framework featuring three special designs: 1) we jointly model student-exercise and group-exercise responses in a multi-task manner to alleviate the sparsity of group-exercise responses; 2) we design a context-aware attention network to model the relationship between student knowledge states and group knowledge states in different contexts; 3) we model an interpretable cognitive layer to obtain student ability, group ability, and exercise factors (e.g., difficulty), and then leverage neural networks to learn complex interaction functions among them. Extensive experiments on real-world datasets demonstrate the generality of MGCD and the effectiveness of our attention design and multi-task learning.
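The three designs in the abstract — context-aware attention over member states, an interpretable proficiency-minus-difficulty interaction, and a joint multi-task objective — can be sketched as follows. This is a minimal illustration under assumed shapes and function names; it is not the authors' implementation, and the specific interaction form (an IRT-style sigmoid of proficiency minus difficulty) is a common simplification, not necessarily the paper's exact cognitive layer.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def group_knowledge_state(student_states, context, Wq, Wk):
    """Context-aware attention: aggregate member knowledge states (m, K)
    into one group state (K,), with weights conditioned on the context."""
    q = Wq @ context           # (d,)  query from the learning context
    k = student_states @ Wk.T  # (m, d) keys from member states
    w = softmax(k @ q)         # (m,)  attention weights, sum to 1
    return w @ student_states  # (K,)  group proficiency per concept

def response_probability(state, difficulty):
    """IRT-style interaction: proficiency minus difficulty -> probability
    of a correct response, averaged over knowledge concepts."""
    return sigmoid(state - difficulty).mean()

def multi_task_loss(p_group, y_group, p_student, y_student, lam=0.5):
    """Joint objective: group-exercise BCE plus a weighted
    student-exercise BCE (the auxiliary task that fights sparsity)."""
    bce = lambda p, y: -(y * np.log(p) + (1 - y) * np.log(1 - p))
    return bce(p_group, y_group) + lam * bce(p_student, y_student)
```

In a full model the student states, difficulties, and attention weights `Wq`, `Wk` (hypothetical names) would be learned by minimizing `multi_task_loss` over both response logs, so the dense student-exercise data regularizes the sparse group-exercise task.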
Pages: 210-219 (10 pages)