Computerized Adaptive Testing for Cognitively Based Multiple-Choice Data

Cited by: 21
Authors
Yigit, Hulya D. [1 ]
Sorrel, Miguel A. [2 ]
de la Torre, Jimmy [3 ]
Affiliations
[1] Univ Illinois, Champaign, IL USA
[2] Univ Autonoma Madrid, Madrid, Spain
[3] Univ Hong Kong, Hong Kong, Peoples R China
Keywords
cognitive diagnosis models; computerized adaptive testing; MC-DINA; G-DINA; item selection methods; JSD; GDI; ITEM SELECTION; DIAGNOSIS MODEL;
DOI
10.1177/0146621618798665
CLC Number
O1 [Mathematics]; C [Social Sciences, General];
Discipline Codes
03 ; 0303 ; 0701 ; 070101 ;
Abstract
Cognitive diagnosis models (CDMs) are latent class models that hold great promise for providing diagnostic information about student knowledge profiles. The increasing use of computers in classrooms enhances the advantages of CDMs for more efficient diagnostic testing by using adaptive algorithms, referred to as cognitive diagnosis computerized adaptive testing (CD-CAT). When multiple-choice items are involved, CD-CAT can be further improved by using polytomous scoring (i.e., considering the specific options students choose), instead of dichotomous scoring (i.e., marking answers as either right or wrong). In this study, the authors propose and evaluate the performance of the Jensen-Shannon divergence (JSD) index as an item selection method for the multiple-choice deterministic inputs, noisy "and" gate (MC-DINA) model. Attribute classification accuracy and item usage are evaluated under different conditions of item quality and test termination rule. The proposed approach is compared with the random selection method and an approximate approach based on dichotomized responses. The results show that under the MC-DINA model, JSD improves the attribute classification accuracy significantly by considering the information from distractors, even with a very short test length. This result has important implications in practical classroom settings as it can allow for dramatically reduced testing times, thus resulting in more targeted learning opportunities.
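At its core, the JSD index described in the abstract quantifies how well an item's option-response distributions separate the candidate attribute profiles: a larger divergence among the conditional distributions means the item is more informative for classification. As a rough illustration only (a generic weighted Jensen-Shannon divergence, not the authors' MC-DINA-specific implementation; function names and the weighting scheme are illustrative assumptions):

```python
import math

def entropy(p):
    """Shannon entropy in nats; zero-probability cells contribute nothing."""
    return -sum(x * math.log(x) for x in p if x > 0)

def jsd(dists, weights):
    """Generalized Jensen-Shannon divergence of several distributions.

    dists   -- conditional option-response distributions, one per latent class
               (illustrative stand-in for P(response | attribute profile))
    weights -- mixing weights, e.g. current posterior over profiles, summing to 1

    JSD = H(sum_i w_i * P_i) - sum_i w_i * H(P_i), which is 0 when all
    distributions coincide and grows as they separate.
    """
    mixture = [sum(w * p[k] for w, p in zip(weights, dists))
               for k in range(len(dists[0]))]
    return entropy(mixture) - sum(w * entropy(p)
                                  for w, p in zip(weights, dists))
```

Under this formulation, an adaptive algorithm would administer the remaining item with the largest divergence; polytomous (option-level) distributions give the index more cells to discriminate with than dichotomized right/wrong scoring.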
Pages: 388-401
Page count: 14
Related Papers
50 records in total
  • [41] MULTIPLE-CHOICE QUESTIONS
    IONA, M
    PHYSICS TEACHER, 1983, 21 (09): 568 - 568
  • [42] MULTIPLE-CHOICE QUESTIONS
    IONA, M
    AMERICAN JOURNAL OF PHYSICS, 1984, 52 (05) : 392 - 392
  • [43] AUTOSCAN - AN ADAPTIVE MULTIPLE-CHOICE BASIC COMPUTER-PROGRAM
    BURNS, E
    JOURNAL OF SCHOOL PSYCHOLOGY, 1988, 26 (03) : 311 - 315
  • [44] MULTIPLE-CHOICE QUESTIONS
    CONWAY, CM
    ANAESTHESIA, 1984, 39 (07) : 715 - 715
  • [45] The equation for medical multiple-choice question testing time estimation
    Kreepala, Chatchai
    Keeratibharat, Nattawut
    Aekgawong, Sekdusit
    Wattanavaekin, Krittanont
    Danjittrong, Taechasit
    Juntararuangtong, Thitikorn
    Chombandit, Theetad
    ANNALS OF MEDICINE AND SURGERY, 2024, 86 (05): 2688 - 2695
  • [46] Memorial consequences of multiple-choice testing on immediate and delayed tests
    Fazio, Lisa K.
    Agarwal, Pooja K.
    Marsh, Elizabeth J.
    Roediger, Henry L., III
    MEMORY & COGNITION, 2010, 38 (04) : 407 - 418
  • [47] Knowledge Assessment: Squeezing Information From Multiple-Choice Testing
    Nickerson, Raymond S.
    Butler, Susan F.
    Carlin, Michael T.
    JOURNAL OF EXPERIMENTAL PSYCHOLOGY-APPLIED, 2015, 21 (02) : 167 - 177
  • [49] Valuing Assessment in Teacher Education - Multiple-choice Competency Testing
    Martin, Dona L.
    Itter, Diane
    AUSTRALIAN JOURNAL OF TEACHER EDUCATION, 2014, 39 (07): 1 - 14
  • [50] The "None of the Above" Option in Multiple-Choice Testing: An Experimental Study
    DiBattista, David
    Sinnige-Egger, Jo-Anne
    Fortuna, Glenda
    JOURNAL OF EXPERIMENTAL EDUCATION, 2014, 82 (02): 168 - 183