Measuring the Impact of Peer Interaction in Group Oral Assessments with an Extended Many-Facet Rasch Model

Cited by: 2
Authors
Jin, Kuan-Yu [1 ]
Eckes, Thomas [2 ]
Affiliations
[1] Hong Kong Examinations and Assessment Authority, Assessment Technology and Research Division, 7/F, 248 Queen's Road East, Hong Kong, People's Republic of China
[2] University of Bochum, TestDaF Institute, Psychometrics and Research Methodology Department, Universitätsstraße 134, D-44799 Bochum, Germany
Keywords
INFORMATION CRITERION; PERFORMANCE
DOI
10.1111/jedm.12375
Chinese Library Classification
G44 [Educational Psychology]
Discipline Classification Codes
0402; 040202
Abstract
Many language proficiency tests include group oral assessments involving peer interaction. In such an assessment, examinees discuss a common topic with others. Human raters score each examinee's spoken performance on specially designed criteria. However, measurement models for analyzing group assessment data usually assume local person independence and thus fail to consider the impact of peer interaction on the assessment outcomes. This research advances an extended many-facet Rasch model for group assessments (MFRM-GA), accounting for local person dependence. In a series of simulations, we examined the MFRM-GA's parameter recovery and the consequences of ignoring peer interactions under the traditional modeling approach. We also used a real dataset from the English-speaking test of the Language Proficiency Assessment for Teachers (LPAT) routinely administered in Hong Kong to illustrate the efficiency of the new model. The discussion focuses on the model's usefulness for measuring oral language proficiency, practical implications, and future research perspectives.
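For context, a minimal sketch of the baseline many-facet Rasch model (MFRM) that the MFRM-GA extends, in a common rating-scale formulation: the log-odds of an examinee receiving score category k rather than k-1 on a criterion from a rater is decomposed into examinee proficiency, criterion difficulty, rater severity, and a category threshold. The additional peer-interaction (local person dependence) term introduced by the MFRM-GA is specified in the full paper and is not reproduced here.

\log\frac{P_{nijk}}{P_{nij(k-1)}} = \theta_n - \beta_i - \alpha_j - \tau_k

Here \theta_n is the proficiency of examinee n, \beta_i the difficulty of scoring criterion i, \alpha_j the severity of rater j, and \tau_k the threshold of category k.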
Pages: 47-68
Number of pages: 22