Gender differences in constructed response reading items

Cited by: 6
Authors:
Pomplun, M [1]
Sundbye, N [1]
Affiliations:
[1] Univ Kansas, Ctr Educ Testing & Evaluat, Lawrence, KS 66049 USA
DOI:
10.1207/s15324818ame1201_6
Chinese Library Classification: G40 [Education]
Discipline codes: 040101; 120403
Abstract
In this study, we explored gender differences in answers to constructed response reading items from a state assessment program. Construct related and unrelated factors were identified by a content expert after a review of the literature and a pilot study. Four raters were trained to score the identified features on approximately 500 papers evenly divided across 2 grade levels and between genders. These features included correctness of answer, unrelated answers, inaccurate answers, number of words written, and a measure of syntactic complexity. The papers rated for these features had already been assigned holistic scores by local teachers using a state-provided rubric. The relations between these studied features, holistic scores, objective scores, and gender differences were explored through correlations and regression analyses. The results indicate that number of words written and number of unrelated responses showed significant gender differences, were related to holistic scores, and were significant even when the other studied variables were controlled statistically. Further research is needed to investigate how these features could influence raters differentially for male and female students.
Pages: 95-109 (15 pages)
Related articles (50 total)
  • [31] Gender differences in reading ability and attitudes: examining where these differences lie
    Logan, Sarah
    Johnston, Rhona
    JOURNAL OF RESEARCH IN READING, 2009, 32 (02) : 199 - 214
  • [32] An Item Response Tree Model for Items with Multiple-Choice and Constructed-Response Parts
    Wei, Junhuan
    Wang, Qin
    Dai, Buyun
    Cai, Yan
    Tu, Dongbo
    JOURNAL OF EDUCATIONAL MEASUREMENT, 2024,
  • [33] English Learners and Constructed-Response Science Test Items: Challenges and Opportunities
    Noble, Tracy
    Wells, Craig S.
    Rosebery, Ann S.
    EDUCATIONAL ASSESSMENT, 2023, 28 (04) : 246 - 272
  • [34] A HO-IRT-Based Diagnostic Assessment System with Constructed Response Items
    Yang, Chih-Wei
    Kuo, Bor-Chen
    Liao, Chen-Huei
    TURKISH ONLINE JOURNAL OF EDUCATIONAL TECHNOLOGY, 2011, 10 (04): 46 - 51
  • [35] Constructed Response Items as an Assessment Method for Undergraduate Medical Course: Improving the Validity
    Somannavar, Manjunath S.
    JOURNAL OF THE SCIENTIFIC SOCIETY, 2019, 46 (01) : 8 - 10
  • [36] Using Differential Item Functioning to Test for Interrater Reliability in Constructed Response Items
    Walker, Cindy M.
    Sahin, Sakine Gocer
    EDUCATIONAL AND PSYCHOLOGICAL MEASUREMENT, 2020, 80 (04) : 808 - 820
  • [37] Recommendations for preparing and scoring constructed-response items: What the experts say
    Hogan, Thomas P.
    Murphy, Gavin
    APPLIED MEASUREMENT IN EDUCATION, 2007, 20 (04) : 427 - 441
  • [38] A Multimedia Effect for Multiple-Choice and Constructed-Response Test Items
    Lindner, Marlit A.
    Schult, Johannes
    Mayer, Richard E.
    JOURNAL OF EDUCATIONAL PSYCHOLOGY, 2022, 114 (01) : 72 - 88
  • [39] Learning to Score: A Coding System for Constructed Response Items via Interactive Clustering
    Luo, Lingjing
    Yang, Hang
    Li, Zhiwu
    Pedrycz, Witold
    SYSTEMS, 2024, 12 (09):
  • [40] Gender differences in reading performance on documents across countries
    Rosén, Monica
    READING AND WRITING, 2001, 14 : 1 - 38