The Effectiveness of Automated Writing Evaluation on Writing Quality: A Meta-Analysis

Cited by: 29
Authors
Zhai, Na [1 ,2 ]
Ma, Xiaomei [1 ]
Affiliations
[1] Xi An Jiao Tong Univ, Sch Foreign Languages, 28 West Xianning Rd, Xian 710049, Peoples R China
[2] Xian Fanyi Univ, Sch Translat Studies, Xian, Peoples R China
Keywords
automated writing evaluation; writing quality; meta-analysis; effect size; feedback; impact; English; computer; accuracy
DOI
10.1177/07356331221127300
CLC (Chinese Library Classification) number
G40 [Education]
Subject classification codes
040101; 120403
Abstract
Automated writing evaluation (AWE) has been frequently used to provide feedback on student writing. Many empirical studies have examined the effectiveness of AWE on writing quality, but the results have been inconclusive, so the magnitude of AWE's overall effect and the factors that influence its effectiveness across studies remain unclear. This study re-examined the issue by meta-analyzing the results of 26 primary studies published from 2010 to 2022, with a total of 2468 participants. The results revealed that AWE had a large positive overall effect on writing quality (g = 0.861, p < 0.001). Moderator analyses further indicated that AWE was more effective for post-secondary students than for secondary students and benefited English as a Foreign Language (EFL) and English as a Second Language (ESL) learners more than native English speaker (NES) learners. With respect to genre, AWE showed a stronger impact on argumentative writing than on academic and mixed writing genres. However, intervention duration, feedback combination, and AWE platform did not moderate the effect of AWE on writing quality. The implications and recommendations for both research and practice are discussed in depth.
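For reference, the following is a minimal sketch of how an overall effect size of this kind is conventionally obtained; the record does not state the exact estimator used by Zhai and Ma, so the standard Hedges' g correction and random-effects model shown here are assumptions offered only for illustration.

g_i = \left(1 - \frac{3}{4(n_{T,i} + n_{C,i}) - 9}\right) \frac{\bar{X}_{T,i} - \bar{X}_{C,i}}{s_{p,i}}, \qquad s_{p,i} = \sqrt{\frac{(n_{T,i} - 1) s_{T,i}^{2} + (n_{C,i} - 1) s_{C,i}^{2}}{n_{T,i} + n_{C,i} - 2}}

\bar{g} = \frac{\sum_i w_i g_i}{\sum_i w_i}, \qquad w_i = \frac{1}{v_i + \hat{\tau}^{2}}

where \bar{X}_{T,i} and \bar{X}_{C,i} are the treatment and control writing-quality means of study i, v_i is the within-study sampling variance of g_i, and \hat{\tau}^{2} is the estimated between-study variance (e.g., via DerSimonian-Laird). On this reading, the reported g = 0.861 would be the weighted mean of the 26 study-level effects, and each moderator analysis would compare such pooled estimates across subgroups (e.g., secondary vs. post-secondary learners).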
Pages: 875-900
Number of pages: 26
Related papers
50 records in total
  • [11] The Process Writing Approach: A Meta-analysis
    Graham, Steve
    Sandmel, Karin
    JOURNAL OF EDUCATIONAL RESEARCH, 2011, 104 (06) : 396 - 407
  • [12] Formative Assessment and Writing: A Meta-Analysis
    Graham, Steve
    Hebert, Michael
    Harris, Karen R.
    ELEMENTARY SCHOOL JOURNAL, 2015, 115 (04) : 523 - 547
  • [13] Automated writing evaluation
    Hockly, Nicky
    ELT JOURNAL, 2019, 73 (01) : 82 - 88
  • [14] Improving Writing Skills of Students in Turkey: a Meta-analysis of Writing Interventions
    Graham, Steve
    Tavsanli, Omer Faruk
    Kaldirim, Abdullah
    EDUCATIONAL PSYCHOLOGY REVIEW, 2022, 34 (02) : 889 - 934
  • [16] Interpreting the effectiveness of academic English writing programmes in higher education: A meta-analysis
    Lee, Hansol
    Lee, Jang Ho
    OXFORD REVIEW OF EDUCATION, 2023, 49 (03) : 281 - 300
  • [17] A Multi-Dimensional Analysis of Writing Flexibility in an Automated Writing Evaluation System
    Allen, Laura K.
    Likens, Aaron D.
    McNamara, Danielle S.
    PROCEEDINGS OF THE 8TH INTERNATIONAL CONFERENCE ON LEARNING ANALYTICS & KNOWLEDGE (LAK'18): TOWARDS USER-CENTRED LEARNING ANALYTICS, 2018 : 380 - 388
  • [18] Automated Writing Evaluation Tools in the Improvement of the Writing Skill
    Parra G, Lorena
    Calero S, Ximena
    INTERNATIONAL JOURNAL OF INSTRUCTION, 2019, 12 (02) : 209 - 226
  • [19] ChatGPT for Automated Writing Evaluation in Scholarly Writing Instruction
    Parker, Jessica L.
    Becker, Kimberly
    Carroca, Catherine
    JOURNAL OF NURSING EDUCATION, 2023, 62 (12) : 721 - 727
  • [20] Automated feedback and writing: a multi-level meta-analysis of effects on students' performance
    Fleckenstein, Johanna
    Liebenow, Lucas W.
    Meyer, Jennifer
    FRONTIERS IN ARTIFICIAL INTELLIGENCE, 2023, 6