Classification Accuracy of Mixed Format Tests: A Bi-Factor Item Response Theory Approach

Cited: 12
Authors
Wang, Wei [1 ]
Drasgow, Fritz [2 ,3 ]
Liu, Liwen [2 ,4 ]
Affiliations
[1] Univ Cent Florida, Dept Psychol, Orlando, FL 32816 USA
[2] Univ Illinois, Dept Psychol, Champaign, IL USA
[3] Univ Illinois, Sch Labor & Employment Relat, Champaign, IL USA
[4] Amer Inst Res, Washington, DC USA
Source
FRONTIERS IN PSYCHOLOGY | 2016, Vol. 7
Keywords
mixed format test; bi-factor model; item response theory; constructed response items; classification accuracy; MULTIPLE-CHOICE TESTS; CONSTRUCTED-RESPONSE; BIFACTOR MODEL; SUBSCORES; DIMENSIONALITY;
DOI
10.3389/fpsyg.2016.00270
CLC Number (Chinese Library Classification)
B84 [Psychology];
Discipline Classification Code
04 ; 0402 ;
Abstract
Mixed format tests (e.g., a test consisting of multiple-choice [MC] items and constructed response [CR] items) have become increasingly popular. However, the latent structure of item pools consisting of the two formats is still equivocal. Moreover, the implications of this latent structure are unclear: For example, do constructed response items tap reasoning skills that cannot be assessed with multiple-choice items? This study explored the dimensionality of mixed format tests by applying bi-factor models to 10 tests covering various subjects from the College Board's Advanced Placement (AP) Program and compared the accuracy of scores based on the bi-factor analysis with scores derived from a unidimensional analysis. More importantly, this study focused on a practical and important question: the classification accuracy of the overall grade on a mixed format test. Our findings revealed that the degree of multidimensionality resulting from the mixed item format varied from subject to subject, depending on the disattenuated correlation between scores from MC and CR subtests. Moreover, remarkably small decrements in classification accuracy were found for the unidimensional analysis when the disattenuated correlations exceeded 0.90.
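As a rough illustration of the 0.90 threshold mentioned in the abstract, the sketch below computes a disattenuated correlation between MC and CR subtest scores using the standard Spearman correction for attenuation. The observed correlation and the reliability values used here are hypothetical placeholders, not results or parameters from the study.

```python
import math

# Hypothetical inputs: observed correlation between MC and CR subtest scores
# and reliability estimates (e.g., coefficient alpha) for each subtest.
# None of these values come from the study; they are illustrative placeholders.
r_observed = 0.78
rel_mc = 0.88
rel_cr = 0.75

# Spearman correction for attenuation:
#   r_disattenuated = r_observed / sqrt(rel_mc * rel_cr)
r_disattenuated = r_observed / math.sqrt(rel_mc * rel_cr)

print(f"disattenuated correlation: {r_disattenuated:.3f}")

# Per the abstract, when the disattenuated correlation exceeded about 0.90,
# scoring with a unidimensional model produced only small losses in
# classification accuracy.
if r_disattenuated > 0.90:
    print("MC and CR subtests look close to unidimensional; "
          "a unidimensional scoring model is likely adequate.")
else:
    print("Noticeable multidimensionality; a bi-factor model may matter more.")
```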
Pages: 11