An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test

Cited by: 2
Authors
Kahraman, Nilufer [1 ]
Affiliation
[1] Baskent Univ, Ankara, Turkey
Source
EURASIAN JOURNAL OF EDUCATIONAL RESEARCH | 2014, Issue 54
Keywords
Explanatory Item Response Theory; Partial Credit Model; Item Response Theory; Performance Tests; Item Calibration; Ability Estimation; Small Tests
DOI
10.14689/ejer.2014.54.7
CLC classification
G40 [Education]
Discipline codes
040101; 120403
Abstract
Problem: Practitioners working with multiple-choice tests have long used Item Response Theory (IRT) models to evaluate the performance of test items for quality assurance. Similar applications for performance tests, however, are often encumbered by the challenges of working with complicated data sets for which local calibrations alone provide a poor model fit.

Purpose: The purpose of this study was to investigate whether the item calibration process for a performance test, the computer-based case simulations (CCS) taken from the United States Medical Licensing Examination® (USMLE®) Step 3® examination, may be improved through explanatory IRT models. It was hypothesized that explanatory IRT may improve data modeling for performance assessment tests by allowing important person predictors to be added to a conventional IRT model, which is limited to item predictors alone.

Methods: The responses of 767 examinees on a six-item CCS test were modeled using the Partial Credit Model (PCM) and four explanatory model extensions, each incorporating one predictor variable of interest. The predictor variables were the examinee's gender, the order in which the examinee encountered each item (item sequence), the time the examinee took to respond to each item (response time), and the examinee's ability score on the multiple-choice part of the examination.

Results: Results demonstrate a superior model fit for the explanatory PCM that includes the examinee ability score from the multiple-choice portion of Step 3. Explanatory IRT model extensions may prove useful in complex performance assessment settings where item calibrations are often problematic due to short tests and small samples.

Recommendations: The findings of this study have practical value and implications for researchers working with small or complicated response data. Explanatory IRT methodology not only provides a way to improve data modeling for performance assessment tests but also enhances the inferences that can be drawn by allowing important person predictors to be incorporated into a conventional IRT model.
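The modeling contrast the abstract describes can be written out. The following is a sketch in standard notation (not taken from the paper itself): the Partial Credit Model for polytomous item scores, followed by the latent-regression form of an explanatory extension in which ability is regressed on a person predictor such as the multiple-choice score.

```latex
% Partial Credit Model: probability that person p with ability \theta_p
% scores category x (of m_i ordered categories) on item i, with step
% difficulties \delta_{ij}
P(X_{pi} = x \mid \theta_p)
  = \frac{\exp \sum_{j=0}^{x} (\theta_p - \delta_{ij})}
         {\sum_{h=0}^{m_i} \exp \sum_{j=0}^{h} (\theta_p - \delta_{ij})},
  \qquad x = 0, 1, \dots, m_i .

% Explanatory (latent-regression) extension: decompose ability on a
% person predictor Z_p, e.g., the examinee's multiple-choice score
\theta_p = \beta Z_p + \varepsilon_p ,
  \qquad \varepsilon_p \sim N(0, \sigma^2).
```

Each of the four explanatory models in the study corresponds to a different choice of the person predictor \(Z_p\); a better fit for one extension indicates that the predictor accounts for systematic variance the plain PCM leaves in the residual.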
Pages: 117-134
Page count: 18
Related papers
50 items total
  • [41] Computer-based lung sound simulation
    Kompis, M
    Russi, EW
    MEDICAL & BIOLOGICAL ENGINEERING & COMPUTING, 1997, 35 (03) : 231 - 238
  • [43] A Monte Carlo comparison of item and person statistics based on item response theory versus classical test theory
    MacDonald, P
    Paunonen, SV
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2002, 62 (06) : 921 - 943
  • [44] Item response theory approach to ethnocentrism
    Monaghan, Conal
    Bizumic, Boris
    FRONTIERS IN POLITICAL SCIENCE, 2023, 5
  • [45] Item Response Theory - A First Approach
    Nunes, Sandra
    Oliveira, Teresa
    Oliveira, Amilcar
    PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON NUMERICAL ANALYSIS AND APPLIED MATHEMATICS 2016 (ICNAAM-2016), 2017, 1863
  • [46] GENESIS-II - A COMPUTER-BASED CASE-MANAGEMENT SIMULATION
    CHUBON, RA
    REHABILITATION COUNSELING BULLETIN, 1986, 30 (01) : 25 - 32
  • [47] A Spectral Approach to Item Response Theory
    Nguyen, Duc
    Zhang, Anderson Y.
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022
  • [48] A COMPUTER-BASED CASE SIMULATION TO ASSESS SKILL IN PREDICTING CLIENT BEHAVIOR
    JANIKOWSKI, TP
    BERVEN, NL
    MEIXELSPERGER, MK
    ROEDL, KE
    REHABILITATION COUNSELING BULLETIN, 1989, 33 (02) : 127 - 139
  • [49] Construction of a reading literacy test item bank for fourth graders based on item response theory
    Chen, Qishan
    Zheng, Haiyan
    Fan, Honglan
    Mo, Lei
    FRONTIERS IN PSYCHOLOGY, 2023, 14
  • [50] Measurement efficiency of innovative item formats in computer-based testing
    Jodoin, MG
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2003, 40 (01) : 1 - 15