Item Response Ranking for Cognitive Diagnosis

Cited: 0
Authors
Tong, Shiwei [1 ,2 ]
Liu, Qi [1 ,2 ]
Yu, Runlong [1 ,2 ]
Huang, Wei [1 ,2 ]
Huang, Zhenya [1 ,2 ]
Pardos, Zachary A. [3 ]
Jiang, Weijie [3 ]
Affiliations
[1] Univ Sci & Technol China, Sch Comp Sci & Technol, Anhui Prov Key Lab Big Data Anal & Applicat, Hefei, Peoples R China
[2] Univ Sci & Technol China, Sch Data Sci, Hefei, Peoples R China
[3] Univ Calif Berkeley, Berkeley, CA USA
Funding
National Natural Science Foundation of China;
Keywords
DOI
Not available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Cognitive diagnosis, a fundamental task in education, aims to reveal students' proficiency levels on knowledge concepts. Monotonicity is one of the basic assumptions in cognitive diagnosis theory: a student's proficiency is monotonic with the probability of giving the right response to a test item. However, few previous methods consider monotonicity during optimization. To this end, we propose the Item Response Ranking framework (IRR), which introduces pairwise learning into cognitive diagnosis to model the monotonicity between item responses. Specifically, we first use an item-specific sampling method to sample item responses and construct response pairs based on their partial order, and we propose two-branch sampling methods to handle unobserved responses. We then use a pairwise objective function to exploit the monotonicity in the pair formulation. IRR is a general framework that can be applied to most contemporary cognitive diagnosis models. Extensive experiments demonstrate the effectiveness and interpretability of our method.
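The pairwise idea in the abstract can be sketched in a few lines: for each item, pair students who answered correctly with students who did not, then penalize the model when the "correct" student's predicted response probability does not exceed the "incorrect" student's. This is a minimal illustrative sketch; the function names, the BPR-style log-sigmoid loss, and the toy data below are assumptions for exposition, not the paper's exact formulation.

```python
import math


def build_response_pairs(responses):
    """Group observed responses by item and pair each correct
    responder with each incorrect responder on the same item.

    responses: list of (student_id, item_id, correct) with correct in {0, 1}.
    Returns a list of (item_id, student_pos, student_neg) triples, where
    student_pos answered the item correctly and student_neg did not.
    """
    by_item = {}
    for student, item, correct in responses:
        by_item.setdefault(item, {True: [], False: []})[bool(correct)].append(student)
    pairs = []
    for item, groups in by_item.items():
        for s_pos in groups[True]:
            for s_neg in groups[False]:
                pairs.append((item, s_pos, s_neg))
    return pairs


def pairwise_irr_loss(p_pos, p_neg):
    """BPR-style pairwise loss: -log(sigmoid(p_pos - p_neg)).

    p_pos / p_neg are the model's predicted probabilities of a correct
    response for the positive and negative student on the same item.
    The loss shrinks as the margin p_pos - p_neg grows, encouraging
    the monotonicity described in the abstract.
    """
    margin = p_pos - p_neg
    return -math.log(1.0 / (1.0 + math.exp(-margin)))


# Toy example: three students attempt item "q1"; s1 and s3 succeed, s2 fails.
responses = [("s1", "q1", 1), ("s2", "q1", 0), ("s3", "q1", 1)]
pairs = build_response_pairs(responses)
```

In practice the predicted probabilities would come from an underlying cognitive diagnosis model (e.g. an IRT- or neural-network-based one), and the pairwise loss would be minimized with gradient descent alongside or in place of the pointwise objective.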
Pages: 1750-1756
Page count: 7
Related Papers
50 records in total
  • [31] Ranking Item Features by Mining Online User-Item Interactions
    Abbar, Sofiane
    Rahman, Habibur
    Thirumuruganathan, Saravanan
    Castillo, Carlos
    Das, Gautam
    2014 IEEE 30TH INTERNATIONAL CONFERENCE ON DATA ENGINEERING (ICDE), 2014, : 460 - 471
  • [32] Understanding examinees' item responses through cognitive modeling of response accuracy and response times
    Embretson, Susan
    LARGE-SCALE ASSESSMENTS IN EDUCATION, 2023, 11 (01)
  • [34] A Cognitive Diagnosis Model for Continuous Response
    Minchen, Nathan D.
    de la Torre, Jimmy
    Liu, Ying
    JOURNAL OF EDUCATIONAL AND BEHAVIORAL STATISTICS, 2017, 42 (06) : 651 - 677
  • [35] Quantitatively ranking incorrect responses to multiple-choice questions using item response theory
    Smith, Trevor I.
    Louis, Kyle J.
    Ricci, Bartholomew J.
    Bendjilali, Nasrine
    PHYSICAL REVIEW PHYSICS EDUCATION RESEARCH, 2020, 16 (01)
  • [36] A Grouped Ranking Model for Item Preference Parameter
    Hino, Hideitsu
    Fujimoto, Yu
    Murata, Noboru
    NEURAL COMPUTATION, 2010, 22 (09) : 2417 - 2451
  • [37] Cross Pairwise Ranking for Unbiased Item Recommendation
    Wan, Qi
    He, Xiangnan
    Wang, Xiang
    Wu, Jiancan
    Guo, Wei
    Tang, Ruiming
    PROCEEDINGS OF THE ACM WEB CONFERENCE 2022 (WWW'22), 2022, : 2370 - 2378
  • [38] Sensitivity of the Informant Questionnaire on Cognitive Decline: An application of item response theory
    Butt, Zeeshan
    AGING NEUROPSYCHOLOGY AND COGNITION, 2008, 15 (05) : 642 - 655
  • [39] Dissecting the expanded cognitive reflection test: an item response theory analysis
    Srol, Jakub
    JOURNAL OF COGNITIVE PSYCHOLOGY, 2018, 30 (07) : 643 - 655
  • [40] Cognitive Reserve Questionnaire: psychometric analysis from the item response theory
    Martino, Pablo
    Caycho-Rodriguez, Tomas
    Valencia, Pablo D.
    Politis, Daniel
    Gallegos, Miguel
    De Bortoli, Miguel A.
    Cervigni, Mauricio
    REVISTA DE NEUROLOGIA, 2022, 75 (07) : 173 - 180