Comparison of Selected- and Constructed-Response Items

Citations: 0
Authors
Li, Haiying [1 ,2 ]
Affiliations
[1] Iowa Coll Aid, Des Moines, IA 50309 USA
[2] Org Econ Cooperat & Dev TJA Fellow, F-75016 Paris, France
Keywords
Constructed-response; PISA science; Selected-response
DOI
10.1007/978-3-031-11647-6_70
CLC Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
A body of research on the assessment of scientific practices has found that constructed-response items (CRI) are more valid than selected-response items (SRI) for assessing scientific explanations. A few studies have compared these item formats in small-scale formative science assessments, but it remains unclear whether the finding generalizes to large-scale science assessments, which is the focus of the present study. The study showed that, on average, one-third of students across 58 countries/regions performed inconsistently when scientific practices were measured with SRI versus CRI.
Pages: 362-366
Number of pages: 5
Related Papers
50 records in total
  • [1] Gender differences for constructed-response mathematics items
    Pomplun, M
    Capps, L
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 1999, 59 (04) : 597 - 614
  • [2] A Comparison of Two Standard-Setting Methods for Tests Consisting of Constructed-Response Items
    Ozarkan, Hatun Betul
    Dogan, Celal Deha
    EURASIAN JOURNAL OF EDUCATIONAL RESEARCH, 2020, (90) : 121 - 138
  • [3] Automated Scoring of Constructed-Response Science Items: Prospects and Obstacles
    Liu, Ou Lydia
    Brew, Chris
    Blackmore, John
    Gerard, Libby
    Madhok, Jacquie
    Linn, Marcia C.
    EDUCATIONAL MEASUREMENT-ISSUES AND PRACTICE, 2014, 33 (02) : 19 - 28
  • [4] Automatic scoring of constructed-response items with latent semantic analysis
    Lenhard, Wolfgang
    Baier, Herbert
    Hoffmann, Joachim
    Schneider, Wolfgang
    DIAGNOSTICA, 2007, 53 (03) : 155 - 165
  • [5] Weighting Constructed-Response Items in IRT-Based Exams
    Sykes, RC
    Hou, LL
    APPLIED MEASUREMENT IN EDUCATION, 2003, 16 (04) : 257 - 275
  • [6] A Multimedia Effect for Multiple-Choice and Constructed-Response Test Items
    Lindner, Marlit A.
    Schult, Johannes
    Mayer, Richard E.
    JOURNAL OF EDUCATIONAL PSYCHOLOGY, 2022, 114 (01) : 72 - 88
  • [7] English Learners and Constructed-Response Science Test Items: Challenges and Opportunities
    Noble, Tracy
    Wells, Craig S.
    Rosebery, Ann S.
    EDUCATIONAL ASSESSMENT, 2023, 28 (04) : 246 - 272
  • [8] Recommendations for preparing and scoring constructed-response items: What the experts say
    Hogan, Thomas P.
    Murphy, Gavin
    APPLIED MEASUREMENT IN EDUCATION, 2007, 20 (04) : 427 - 441
  • [9] Cognitive diagnostic models for tests with multiple-choice and constructed-response items
    Kuo, Bor-Chen
    Chen, Chun-Hua
    Yang, Chih-Wei
    Mok, Magdalena Mo Ching
    EDUCATIONAL PSYCHOLOGY, 2016, 36 (06) : 1115 - 1133
  • [10] An Item Response Tree Model for Items with Multiple-Choice and Constructed-Response Parts
    Wei, Junhuan
    Wang, Qin
    Dai, Buyun
    Cai, Yan
    Tu, Dongbo
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2024