Automated essay evaluation software in English Language Arts classrooms: Effects on teacher feedback, student motivation, and writing quality

Cited by: 126
Authors
Wilson, Joshua [1 ]
Czik, Amanda [1 ]
Affiliation
[1] Univ Delaware, Newark, DE 19716 USA
Keywords
Automated essay evaluation; Interactive learning environments; Writing; English Language Arts; ELEMENTARY; INSTRUCTION; ACHIEVEMENT; WORD;
DOI
10.1016/j.compedu.2016.05.004
CLC number
TP39 [Applications of Computers]
Subject classification codes
081203; 0835
Abstract
Automated Essay Evaluation (AEE) systems are being increasingly adopted in the United States to support writing instruction. AEE systems are expected to assist teachers in providing more higher-level feedback and expediting the feedback process, while supporting gains in students' writing motivation and writing quality. The current study explored these claims using a quasi-experimental design. Four eighth-grade English Language Arts (ELA) classes were assigned to a combined feedback condition in which they received feedback on their writing from their teacher and from an automated essay evaluation (AEE) system called PEG Writing. Four other eighth-grade ELA classes were assigned to a teacher feedback-only condition, in which they received feedback from their teacher via Google Docs. Results indicated that teachers gave the same median amount of feedback to students in both conditions, but gave proportionately more feedback on higher-level writing skills to students in the combined PEG + Teacher Feedback condition. Teachers also agreed that PEG helped them save one-third to half the time it took to provide feedback when they were the sole source of feedback (i.e., the Google Docs condition). At the conclusion of the study, students in the combined feedback condition demonstrated increases in writing persistence, though there were no differences between groups with regard to final-draft writing quality. (C) 2016 Elsevier Ltd. All rights reserved.
Pages: 94-109
Page count: 16