Efficiency Analysis of Item Response Theory Kernel Equating for Mixed-Format Tests

Cited by: 0
Authors
Wallmark, Joakim [1 ]
Josefsson, Maria [1 ]
Wiberg, Marie [1 ]
Institutions
[1] Umea Univ, Dept Stat, USBE, Umea, Sweden
Keywords
kernel equating; presmoothing; item response theory; log-linear models; simulation; ASYMPTOTIC STANDARD ERRORS; CONSTRUCTED-RESPONSE; LINKING;
DOI
10.1177/01466216231209757
Chinese Library Classification (CLC)
O1 [Mathematics]; C [Social Sciences, General];
Discipline Classification Codes
03 ; 0303 ; 0701 ; 070101 ;
Abstract
This study evaluates the performance of Item Response Theory (IRT) kernel equating in the context of mixed-format tests by comparing it to IRT observed score equating and to kernel equating with log-linear presmoothing. Comparisons were made through both simulations and real data applications, under both equivalent groups (EG) and non-equivalent groups with anchor test (NEAT) sampling designs. To prevent bias towards IRT methods, data were simulated both with and without the use of IRT models. The results suggest that the difference between IRT kernel equating and IRT observed score equating is minimal, both in terms of the equated scores and their standard errors. Using IRT models for presmoothing yielded smaller standard errors of equating than the log-linear presmoothing approach. When test data were generated using IRT models, the IRT-based methods proved less biased than log-linear kernel equating; when data were simulated without IRT models, log-linear kernel equating showed less bias. Overall, IRT kernel equating shows great promise for equating mixed-format tests.
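To give a concrete sense of the kernel equating idea the abstract refers to, here is a minimal Python sketch under an equivalent-groups design. It is not the authors' implementation: it uses a simplified Gaussian-kernel continuization (omitting the mean- and variance-preserving linear adjustment of the full kernel equating framework, as well as any presmoothing step), and the score points, probabilities, and bandwidth below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

def continuized_cdf(x, scores, probs, h):
    # Gaussian-kernel continuization of a discrete score distribution:
    # F(x) = sum_j p_j * Phi((x - x_j) / h). Simplified: the full kernel
    # equating framework also applies a linear transformation so that the
    # continuized distribution keeps the discrete mean and variance.
    return float(np.sum(probs * norm.cdf((x - scores) / h)))

def kernel_equate(x, scores_x, probs_x, scores_y, probs_y, h=0.6):
    # Equate a form-X score to the form-Y scale: e(x) = G^{-1}(F(x)),
    # inverting G numerically by bisection (F and G are strictly increasing).
    p = continuized_cdf(x, scores_x, probs_x, h)
    lo = float(scores_y.min()) - 5 * h
    hi = float(scores_y.max()) + 5 * h
    for _ in range(80):
        mid = (lo + hi) / 2
        if continuized_cdf(mid, scores_y, probs_y, h) < p:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Toy example (illustrative numbers): two 0-5 score forms, form Y harder.
scores = np.arange(6)
probs_x = np.array([0.05, 0.15, 0.25, 0.30, 0.15, 0.10])
probs_y = np.array([0.10, 0.20, 0.30, 0.20, 0.15, 0.05])
equated = kernel_equate(3.0, scores, probs_x, scores, probs_y)
```

Since form Y is harder (its score distribution sits lower), a form-X score of 3 maps to a somewhat lower score on the form-Y scale; equating a distribution against itself returns the input score, a useful sanity check.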
Pages: 496-512
Page count: 17
Related Papers
50 records in total
  • [1] Practical Consequences of Item Response Theory Model Misfit in the Context of Test Equating with Mixed-Format Test Data
    Zhao, Yue
    Hambleton, Ronald K.
    [J]. FRONTIERS IN PSYCHOLOGY, 2017, 8
  • [2] Impact of Accumulated Error on Item Response Theory Pre-Equating With Mixed Format Tests
    Keller, Lisa A.
    Keller, Robert
    Cook, Robert J.
    Colvin, Kimberly F.
    [J]. APPLIED MEASUREMENT IN EDUCATION, 2016, 29 (01) : 65 - 82
  • [3] Item Response Theory Models for Wording Effects in Mixed-Format Scales
    Wang, Wen-Chung
    Chen, Hui-Fang
    Jin, Kuan-Yu
    [J]. EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2015, 75 (01) : 157 - 178
  • [4] A Comparison of Kernel Equating and Item Response Theory Equating Methods
    Akin-Arikan, Cigdem
    Gelbal, Selahattin
    [J]. EURASIAN JOURNAL OF EDUCATIONAL RESEARCH, 2021, (93): 179 - 198
  • [5] Comparison of Kernel Equating and Kernel Local Equating in Item Response Theory Observed Score Equating
    Yildirim Seheryeli, Merve
    Yahsi Sari, Hasibe
    Kelecioglu, Hulya
    [J]. JOURNAL OF MEASUREMENT AND EVALUATION IN EDUCATION AND PSYCHOLOGY-EPOD, 2021, 12 (04): 348 - 357
  • [6] Bi-Factor MIRT Observed-Score Equating for Mixed-Format Tests
    Lee, Guemin
    Lee, Won-Chan
    [J]. APPLIED MEASUREMENT IN EDUCATION, 2016, 29 (03) : 224 - 241
  • [7] Comparisons among Designs for Equating Mixed-Format Tests in Large-Scale Assessments
    Kim, Sooyeon
    Walker, Michael E.
    McHale, Frederick
    [J]. JOURNAL OF EDUCATIONAL MEASUREMENT, 2010, 47 (01) : 36 - 53
  • [8] The Impact of Item Feature and Response Preference in a Mixed-Format Design
    Chen, Hui-Fang
    Jin, Kuan-Yu
    [J]. MULTIVARIATE BEHAVIORAL RESEARCH, 2022, 57 (2-3) : 208 - 222
  • [9] The Effect of Proportion of Mixed-Format Scoring: Mixed-Format Achievement Tests
    Saen-amnuaiphon, R.
    Tuksino, P.
    Nichanong, C.
    [J]. INTERNATIONAL CONFERENCE ON EDUCATION & EDUCATIONAL PSYCHOLOGY (ICEEPSY 2012), 2012, 69 : 1522 - 1528
  • [10] Sequential Bayesian Ability Estimation Applied to Mixed-Format Item Tests
    Xiong, Jiawei
    Cohen, Allan S.
    Xiong, Xinhui
    [J]. APPLIED PSYCHOLOGICAL MEASUREMENT, 2023, 47 (5-6) : 402 - 419