Impact of test design, item quality, and item bank size on the psychometric properties of computer-based credentialing examinations

Cited by: 23
Authors
Xing, DH [1 ]
Hambleton, RK
Affiliations
[1] Ministry of Education, Vancouver, BC, Canada
[2] University of Massachusetts, Amherst, MA 01003, USA
Keywords
multistage tests; computer-based tests; test designs; credentialing exams
DOI
10.1177/0013164403258393
Chinese Library Classification
G44 [Educational Psychology]
Discipline codes
0402; 040202
Abstract
Computer-based testing by credentialing agencies has become common; however, selecting a test design is difficult because several good ones are available: parallel forms, computer-adaptive testing (CAT), and multistage testing (MST). In this study, three computer-based test designs were investigated under some common examination conditions. Item bank size and item quality had a practically significant impact on decision consistency and accuracy. Even in nearly ideal situations, the choice of test design was not a factor in the results. Two conclusions follow from the findings: (a) more time and resources should be committed to expanding the size and quality of item banks, and (b) designs that individualize an exam administration, such as MST and CAT, may not be helpful when the primary purpose of the examination is to make pass-fail decisions and conditions permit the use of parallel forms with a target information function centered on the passing score.
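To make the quantities named in the abstract concrete, the sketch below (not taken from the paper) shows one way to compute test information at a passing score and to estimate pass-fail decision consistency and accuracy for a fixed parallel-forms design. It assumes a three-parameter logistic (3PL) IRT model with the 1.7 scaling constant, a hypothetical 40-item form, a cut score of theta = 0.5, number-correct classification, and a standard normal examinee population; all of these choices are illustrative, not the authors' specifications.

```python
import numpy as np

rng = np.random.default_rng(42)

def p_3pl(theta, a, b, c):
    """P(correct) for each item under the three-parameter logistic (3PL) model."""
    # theta: (n_examinees,), a/b/c: (n_items,) -> result: (n_examinees, n_items)
    return c + (1.0 - c) / (1.0 + np.exp(-1.7 * a * np.subtract.outer(theta, b)))

def item_information(theta, a, b, c):
    """Fisher information contributed by each item at ability theta."""
    p = p_3pl(theta, a, b, c)
    return (1.7 * a) ** 2 * ((1.0 - p) / p) * ((p - c) / (1.0 - c)) ** 2

# Hypothetical 40-item fixed form; difficulties clustered near the passing score
# so the test information function is centered on the cut, as in conclusion (b).
n_items, theta_cut = 40, 0.5
a = rng.lognormal(0.0, 0.3, n_items)      # discriminations
b = rng.normal(theta_cut, 0.4, n_items)   # difficulties near the cut score
c = np.full(n_items, 0.20)                # pseudo-guessing parameters

tif_at_cut = item_information(np.array([theta_cut]), a, b, c).sum()
print(f"Test information at the cut score: {tif_at_cut:.1f}")

# Simulate two administrations of this (nominally parallel) form to the same group.
n_examinees = 5000
theta = rng.normal(0.0, 1.0, n_examinees)
p = p_3pl(theta, a, b, c)
form1 = (rng.random(p.shape) < p).sum(axis=1)  # number-correct scores, form 1
form2 = (rng.random(p.shape) < p).sum(axis=1)  # number-correct scores, form 2

# Number-correct passing score: expected score of a borderline examinee.
raw_cut = p_3pl(np.array([theta_cut]), a, b, c).sum()
pass1, pass2 = form1 >= raw_cut, form2 >= raw_cut
true_pass = theta >= theta_cut

print(f"Decision consistency: {np.mean(pass1 == pass2):.3f}")     # agreement across forms
print(f"Decision accuracy:    {np.mean(pass1 == true_pass):.3f}")  # agreement with true status
```

Decision consistency (agreement of pass-fail decisions across repeated administrations) and decision accuracy (agreement with the classification implied by true ability) are the indices the abstract refers to; adding items that are informative near the cut score raises test information there, which is what drives both indices regardless of the delivery design.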
Pages: 5-21
Number of pages: 17
Related papers
29 records in total
  • [1] Measurement Properties of Two Innovative Item Formats in a Computer-Based Test
    Wan, Lei
    Henly, George A.
    APPLIED MEASUREMENT IN EDUCATION, 2012, 25 (01) : 58 - 78
  • [2] Item Attributes Analysis of Computer-based Test based on IRT
    Chen, Deng-Jyi
    Lai, Ah-Fur
    Chen, Shu-Ling
    TOWARDS SUSTAINABLE AND SCALABLE EDUCATIONAL INNOVATIONS INFORMED BY LEARNING SCIENCES, 2005, 133 : 638 - 641
  • [3] Item-by-item versus end-of-test feedback in a computer-based PSI course
    Buzhardt J.
    Semb G.B.
    Journal of Behavioral Education, 2002, 11 (2) : 89 - 104
  • [4] Comparison of the psychometric properties of several computer-based test designs for credentialing exams with multiple purposes
    Jodoin, Michael G.
    Zenisky, April
    Hambleton, Ronald K.
    APPLIED MEASUREMENT IN EDUCATION, 2006, 19 (03) : 203 - 220
  • [5] Using Response Time to Detect Item Preknowledge in Computer-Based Licensure Examinations
    Qian, Hong
    Staniewska, Dorota
    Reckase, Mark
    Woo, Ada
    EDUCATIONAL MEASUREMENT-ISSUES AND PRACTICE, 2016, 35 (01) : 38 - 47
  • [6] Examining Differential Item Functioning in a Computer-Based English Proficiency Test
    Min, Shangchao
    PROCEEDINGS OF 2015 YOUTH ACADEMIC FORUM ON LINGUISTICS, LITERATURE, TRANSLATION AND CULTURE, 2015, : 22 - 29
  • [7] The Psychometric development of an item bank and short forms that assess the impact of asthma on quality of life
    Stucky, Brian D.
    Edelen, Maria O.
    Eberhart, Nicole K.
    Lara, Marielena
    QUALITY OF LIFE RESEARCH, 2013, 22
  • [8] Item design considerations for computer-based testing of student learning in chemistry
    Bowen, CW
    JOURNAL OF CHEMICAL EDUCATION, 1998, 75 (09) : 1172 - 1175
  • [9] An Explanatory Item Response Theory Approach for a Computer-Based Case Simulation Test
    Kahraman, Nilufer
    EURASIAN JOURNAL OF EDUCATIONAL RESEARCH, 2014, (54): : 117 - 134
  • [10] A computer-based test item bank for cognitive assessment of medical students during a clinical medicine clerkship
    Brooks, CM
    Dismukes, WE
    Williams, GR
    Brown, S
    MEDICAL EDUCATION, 1982, 16 (01) : 12 - 17