Item response theory models applied to data allowing examinee choice

Cited: 9
Authors
Bradlow, ET
Thomas, N
Affiliations
[1] Univ Penn, Wharton Sch Business, Philadelphia, PA 19104 USA
[2] Univ N Carolina, Dept Biostat, Chapel Hill, NC USA
Keywords
examinee choice; item response theory; missing data
DOI
10.2307/1165246
Chinese Library Classification (CLC)
G40 [Education]
Discipline Classification Codes
040101; 120403
Abstract
Examinations that permit students to choose a subset of the items are popular despite the potential that students may take examinations of varying difficulty as a result of their choices. We provide a set of conditions for the validity of inference for item response theory (IRT) models applied to data collected from choice-based examinations. Valid likelihood and Bayesian inference using standard estimation methods requires (except in extraordinary circumstances) that there be no dependence, after conditioning on the observed item responses, between the examinees' choices and their (potential but unobserved) responses to omitted items, as well as their latent abilities. These independence assumptions are typical of those required in much more general settings. Common low-dimensional IRT models estimated by standard methods, though potentially useful tools for educational data, do not resolve the difficult problems posed by choice-based data.
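The following is a minimal simulation sketch (not from Bradlow & Thomas) of the ignorability condition stated in the abstract, assuming a Rasch model with known item difficulties, a standard normal prior on ability, and an illustrative weighting scheme for non-ignorable choice; the helper names (`map_theta`, `simulate`) are assumptions introduced only for this example.

```python
# Sketch: compare ability estimation when item choice is independent of the
# (potential) responses versus when examinees favour items they would answer
# correctly, i.e. when choice depends on unobserved responses.
import numpy as np

rng = np.random.default_rng(0)

n_examinees, n_items, n_chosen = 2000, 10, 6
theta = rng.normal(0.0, 1.0, n_examinees)   # latent abilities
b = np.linspace(-1.5, 1.5, n_items)         # Rasch item difficulties (treated as known)


def map_theta(resp, items, b, n_iter=25):
    """Newton-Raphson MAP estimate of ability under a N(0,1) prior (Rasch model)."""
    t = 0.0
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-(t - b[items])))
        grad = np.sum(resp - p) - t            # log-likelihood + log-prior gradient
        hess = -np.sum(p * (1.0 - p)) - 1.0    # corresponding second derivative
        t -= grad / hess
    return t


def simulate(choice_depends_on_responses):
    est = np.empty(n_examinees)
    for i in range(n_examinees):
        p_full = 1.0 / (1.0 + np.exp(-(theta[i] - b)))
        y_full = (rng.random(n_items) < p_full).astype(float)   # potential responses
        if choice_depends_on_responses:
            # Non-ignorable choice: items the examinee would answer correctly
            # are four times as likely to be chosen (illustrative assumption).
            w = np.where(y_full == 1.0, 4.0, 1.0)
            pick = rng.choice(n_items, n_chosen, replace=False, p=w / w.sum())
        else:
            # Ignorable choice: items picked at random, independently of responses.
            pick = rng.choice(n_items, n_chosen, replace=False)
        est[i] = map_theta(y_full[pick], pick, b)
    return est


for flag, label in [(False, "choice independent of responses"),
                    (True, "choice depends on potential responses")]:
    bias = simulate(flag) - theta
    print(f"{label}: mean bias of ability estimates = {bias.mean():+.3f}")
```

Under the random-choice condition the estimates are roughly unbiased apart from the usual shrinkage toward the prior mean, whereas response-dependent choice inflates them, illustrating the failure of the independence assumptions the abstract describes.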
Pages: 236-243
Number of pages: 8
Related Papers
50 records in total
  • [41] A note on monotonicity of item response functions for ordered polytomous item response theory models
    Kang, Hyeon-Ah
    Su, Ya-Hui
    Chang, Hua-Hua
    [J]. BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2018, 71(03): 523-535
  • [42] Factor Copula Models for Item Response Data
    Nikoloulopoulos, Aristidis K.
    Joe, Harry
    [J]. PSYCHOMETRIKA, 2015, 80(01): 126-150
  • [44] Item Response Theory Models for Polytomous Multidimensional Forced-Choice Items to Measure Construct Differentiation
    Qiu, Xuelan
    de la Torre, Jimmy
    Wang, You-Gan
    Wu, Jinran
    [J]. EDUCATIONAL MEASUREMENT-ISSUES AND PRACTICE, 2024
  • [45] A Bayesian nonparametric approach for handling item and examinee heterogeneity in assessment data
    Pan, Tianyu
    Shen, Weining
    Davis-Stober, Clintin P.
    Hu, Guanyu
    [J]. BRITISH JOURNAL OF MATHEMATICAL & STATISTICAL PSYCHOLOGY, 2024, 77(01): 196-211
  • [46] Response to: On item response theory models in allostatic load scoring
    Liu, Shelley H.
    Juster, Robert-Paul
    Dams-O'Connor, Kristen
    Spicer, Julie
    [J]. COMPREHENSIVE PSYCHONEUROENDOCRINOLOGY, 2021, 6
  • [47] The Impact of Ignoring Multilevel Data Structure on the Estimation of Dichotomous Item Response Theory Models
    Lee, Hyung Rock
    Lee, Sunbok
    Sung, Jaeyun
    [J]. INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2019, 6(01): 92-108
  • [48] Multidimensional item response theory models for testlet-based doubly bounded data
    Liu, Chen-Wei
    [J]. BEHAVIOR RESEARCH METHODS, 2024, 56(06): 5309-5353
  • [49] Posterior predictive assessment of item response theory models
    Sinharay, Sandip
    Johnson, Matthew S.
    Stern, Hal S.
    [J]. APPLIED PSYCHOLOGICAL MEASUREMENT, 2006, 30(04): 298-321
  • [50] Synthesizing the Ability in Multidimensional Item Response Theory Models
    Montenegro Diaz, Alvaro Mauricio
    Cepeda, Edilberto
    [J]. REVISTA COLOMBIANA DE ESTADISTICA, 2010, 33(01): 127-147