Item Selection and Ability Estimation Procedures for a Mixed-Format Adaptive Test

Cited by: 3
Authors
Ho, Tsung-Han [1 ]
Dodd, Barbara G. [1 ]
Affiliations
[1] Univ Texas Austin, Austin, TX 78712 USA
Keywords
PARTIAL CREDIT MODEL; LIKELIHOOD-ESTIMATION; EXPOSURE; CRITERIA; STRATEGIES;
DOI
10.1080/08957347.2012.714686
Chinese Library Classification
G40 [Education];
Subject Classification Codes
040101 ; 120403 ;
Abstract
In this study we compared five item selection procedures using three ability estimation methods in the context of a mixed-format adaptive test based on the generalized partial credit model. The item selection procedures used were maximum posterior weighted information, maximum expected information, maximum posterior weighted Kullback-Leibler information, and maximum expected posterior weighted Kullback-Leibler information procedures. The ability estimation methods investigated were maximum likelihood estimation (MLE), weighted likelihood estimation (WLE), and expected a posteriori (EAP). Results suggested that all item selection procedures, regardless of the information functions on which they were based, performed equally well across ability estimation methods. The principal conclusions drawn about the ability estimation methods are that MLE is a practical choice and WLE should be considered when there is a mismatch between pool information and the population ability distribution. EAP can serve as a viable alternative when an appropriate prior ability distribution is specified. Several implications of the findings for applied measurement are discussed.
Pages: 305-326 (22 pages)
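The abstract compares maximum likelihood (MLE), weighted likelihood (WLE), and expected a posteriori (EAP) ability estimation under the generalized partial credit model (GPCM). As a rough illustration of one of these methods, the sketch below computes an EAP ability estimate for a few GPCM items using simple quadrature; the item parameters, grid settings, and function names are hypothetical and are not taken from the paper.

```python
import math

def gpcm_probs(theta, a, b):
    """Category response probabilities for one GPCM item.
    theta: latent ability; a: discrimination; b: list of step
    difficulties b_1..b_m (category 0 contributes no step term)."""
    # Cumulative logits z_k = sum_{v<=k} a * (theta - b_v), with z_0 = 0.
    z = [0.0]
    for bv in b:
        z.append(z[-1] + a * (theta - bv))
    denom = sum(math.exp(zk) for zk in z)
    return [math.exp(zk) / denom for zk in z]

def eap_estimate(responses, items, n_quad=81, bound=4.0):
    """EAP ability estimate with a standard-normal prior, approximated
    on an equally spaced quadrature grid over [-bound, bound].
    responses: observed score category per item; items: (a, b) tuples."""
    nodes = [-bound + 2 * bound * i / (n_quad - 1) for i in range(n_quad)]
    num = den = 0.0
    for t in nodes:
        # Unnormalized posterior: N(0, 1) prior kernel times the likelihood.
        weight = math.exp(-0.5 * t * t)
        for x, (a, b) in zip(responses, items):
            weight *= gpcm_probs(t, a, b)[x]
        num += t * weight
        den += weight
    return num / den  # posterior mean of theta

# Two hypothetical polytomous items, each with three score categories.
items = [(1.2, [-0.5, 0.8]), (0.9, [0.0, 1.1])]
theta_hat = eap_estimate([2, 1], items)
```

Because EAP is a posterior mean, it remains finite for all-zero or all-perfect response patterns, which is one practical reason the abstract treats it as a viable alternative to MLE when a reasonable prior is available.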
Related Papers (50 records)
  • [31] Item selection rules in a Computerized Adaptive Test for the assessment of written English.
    Ramon Barrada, Juan
    Olea, Julio
    Ponsoda, Vicente
    Abad, Francisco J.
    [J]. PSICOTHEMA, 2006, 18 (04) : 828 - 834
  • [32] Computer adaptive practice of Maths ability using a new item response model for on the fly ability and difficulty estimation
    Klinkenberg, S.
    Straatemeier, M.
    van der Maas, H. L. J.
    [J]. COMPUTERS & EDUCATION, 2011, 57 (02) : 1813 - 1824
  • [33] Computerized Adaptive Testing in Early Education: Exploring the Impact of Item Position Effects on Ability Estimation
    Albano, Anthony D.
    Cai, Liuhan
    Lease, Erin M.
    McConnell, Scott R.
    [J]. JOURNAL OF EDUCATIONAL MEASUREMENT, 2019, 56 (02) : 437 - 451
  • [34] Effects of Calibration Sample Size and Item Bank Size on Ability Estimation in Computerized Adaptive Testing
    Sahin, Alper
    Weiss, David J.
    [J]. EDUCATIONAL SCIENCES-THEORY & PRACTICE, 2015, 15 (06): : 1585 - 1595
  • [35] Simple Estimation and Test Procedures in Capture-Mark-Recapture Mixed Models
    Lebreton, J. D.
    Choquet, R.
    Gimenez, O.
    [J]. BIOMETRICS, 2012, 68 (02) : 494 - 503
  • [36] Computerized adaptive testing with the partial credit model: Estimation procedures, population distributions, and item pool characteristics
    Gorin, JS
    Dodd, BG
    Fitzpatrick, SJ
    Shieh, YY
    [J]. APPLIED PSYCHOLOGICAL MEASUREMENT, 2005, 29 (06) : 433 - 456
  • [37] Ability estimation in computerized adaptive test using Mamdani Fuzzy Inference System
    Ridwan, W.
    Wiranto, I
    Dako, R. D. R.
    [J]. INTERNATIONAL SYMPOSIUM ON MATERIALS AND ELECTRICAL ENGINEERING 2019 (ISMEE 2019), 2020, 850
  • [38] The analysis of response patterns on IRT ability estimation methods in computerized adaptive test
    Chen, Deng-Jyi
    Lai, Ah-Fur
    Mao, Chia-Chi
    [J]. 7TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED LEARNING TECHNOLOGIES, PROCEEDINGS, 2007, : 721 - +
  • [39] A study of Quality Assessment of Science Instructional Management in Thailand: An analysis of Differential Item Functioning and Test Functioning in Mixed format Tests
    Chanpleng, Panat
    Lawthong, Nuttaporn
    Ngudgratoke, Sungwon
    [J]. Proceedings of 6th World Conference on Educational Sciences, 2015, 191 : 121 - 125
  • [40] The study of the effect of item parameter drift on ability estimation obtained from adaptive testing under different conditions
    Kursad, Merve Sahin
    Bokeoglu, Omay Cokluk
    Cikrikci, Rahime Nukhet
    [J]. INTERNATIONAL JOURNAL OF ASSESSMENT TOOLS IN EDUCATION, 2022, 9 (03): : 654 - 681