An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test

Cited: 2
|
Authors
Kahraman, Nilufer [1]
Affiliation
[1] Baskent Univ, Ankara, Turkey
Source
EURASIAN JOURNAL OF EDUCATIONAL RESEARCH | 2014, Issue 54
Keywords
Explanatory Item Response Theory; Partial Credit Model; Item Response Theory; Performance Tests; Item Calibration; Ability Estimation; Small Tests;
DOI
10.14689/ejer.2014.54.7
Chinese Library Classification
G40 [Education];
Discipline Codes
040101; 120403;
Abstract
Problem: Practitioners working with multiple-choice tests have long utilized Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. The use of similar applications for performance tests, however, is often encumbered by the challenges of working with complicated data sets in which local calibrations alone provide a poor model fit. Purpose: The purpose of this study was to investigate whether the item calibration process for a performance test, computer-based case simulations (CCS), taken from the United States Medical Licensing Examination® (USMLE®) Step 3® examination, may be improved through explanatory IRT models. It was hypothesized that explanatory IRT may help improve data modeling for performance assessment tests by allowing important predictors to be added to a conventional IRT model, which is limited to item predictors alone. Methods: The responses of 767 examinees from a six-item CCS test were modeled using the Partial Credit Model (PCM) and four explanatory model extensions, each incorporating one predictor variable of interest. The predictor variables were the examinees' gender, the order in which examinees encountered an individual item (item sequence), the time it took each examinee to respond to each item (response time), and examinees' ability score on the multiple-choice part of the examination. Results: Results demonstrate a superior model fit for the explanatory PCM with the examinee ability score from the multiple-choice portion of Step 3. Explanatory IRT model extensions might prove useful in complex performance assessment test settings where item calibrations are often problematic due to short tests and small samples. Recommendations: The findings of this study have practical value and implications for researchers working with small or complicated response data. Explanatory IRT methodology not only provides a way to improve data modeling for performance assessment tests but also enhances the inferences made by allowing important person predictors to be incorporated into a conventional IRT model.
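The modeling idea in the abstract can be sketched in a few lines of code: the standard Partial Credit Model gives the probability of each score category from a latent ability θ and item step difficulties, and the explanatory extension replaces the free θ with a regression on a person predictor (here, the multiple-choice ability score). This is a minimal illustrative sketch, not the paper's estimation procedure; the regression coefficients, step difficulties, and predictor value below are hypothetical.

```python
import math

def pcm_probs(theta, deltas):
    """Partial Credit Model: probability of each score category 0..m
    for an item with step difficulties `deltas` (length m) and
    latent ability `theta`."""
    # Cumulative sums of (theta - delta_k); category 0 contributes sum 0.
    sums = [0.0]
    for d in deltas:
        sums.append(sums[-1] + (theta - d))
    exps = [math.exp(s) for s in sums]
    total = sum(exps)
    return [e / total for e in exps]

# Explanatory (latent-regression) extension: instead of estimating theta
# freely, model it as a linear function of a person predictor, e.g. the
# examinee's multiple-choice ability score (coefficients hypothetical).
beta0, beta1 = -0.5, 0.8
mcq_score = 1.2                       # standardized MCQ score (hypothetical)
theta_p = beta0 + beta1 * mcq_score   # predicted ability for this person

# Category probabilities for a 4-category (0-3) CCS item.
probs = pcm_probs(theta_p, deltas=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])
```

In a full explanatory analysis the regression coefficients and step difficulties would be estimated jointly (e.g., by marginal maximum likelihood); the sketch only shows how a person predictor enters the conventional PCM.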
Pages: 117 - 134
Page count: 18
Related Papers
50 records
  • [21] Teaching color theory for automotive coatings: A computer-based approach
    Henry, M
    Killoran, M
    Monteleone, P
    Gromek, S
    COLOR RESEARCH AND APPLICATION, 2003, 28 (05): 327 - 334
  • [22] Item Analysis of Test of Proficiency in Korean: Classical Test Theory and Item Response Theory
    Yu, Minae
    Kim, Hyunah
    KOREAN LANGUAGE IN AMERICA, 2019, 23 (01): 1 - 26
  • [23] ON A COMPUTER-BASED THEORY OF STRATEGIES
    FINDLER, NV
    KYBERNETES, 1983, 12 (02) : 89 - 97
  • [25] Applying computer-based simulation to energy auditing: A case study
    Zhu, YM
    ENERGY AND BUILDINGS, 2006, 38 (05) : 421 - 428
  • [26] Item Response Theory-Based Continuous Test Norming
    Heister, Hannah M.
    Albers, Casper J.
    Wiberg, Marie
    Timmerman, Marieke E.
    PSYCHOLOGICAL METHODS, 2024,
  • [27] Investigating Subscores of VERA 3 German Test Based on Item Response Theory/Multidimensional Item Response Theory Models
    Temel, Gueler Yavuz
    Machunsky, Maya
    Rietz, Christian
    Okropiridze, Dimitry
    FRONTIERS IN EDUCATION, 2022, 7
  • [28] Investigating the Effect of Item Position in Computer-Based Tests
    Li, Feiming
    Cohen, Allan
    Shen, Linjun
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2012, 49 (04) : 362 - 379
  • [29] ITEM BANKING IN COMPUTER-BASED INSTRUCTIONAL-SYSTEMS
    BAKER, FB
    APPLIED PSYCHOLOGICAL MEASUREMENT, 1986, 10 (04) : 405 - 414
  • [30] A MULTIMEDIA ITEM AUTHORING FRAMEWORK FOR COMPUTER-BASED EDUCATION
    Cheng, Irene
    Badalov, Alexey
    ICME: 2009 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, VOLS 1-3, 2009: 1206 - 1209