The effectiveness of automated writing evaluation in EFL/ESL writing: a three-level meta-analysis

Cited by: 23
Authors
Ngo, Thuy Thi-Nhu [1 ]
Chen, Howard Hao-Jan [1 ]
Lai, Kyle Kuo-Wei [1 ]
Affiliations
[1] Natl Taiwan Normal Univ, English Dept, Taipei, Taiwan
Keywords
Automated writing evaluation; AWE; writing; meta-analysis; effectiveness; impact; feedback; design
DOI
10.1080/10494820.2022.2096642
Chinese Library Classification
G40 [Education]
Discipline codes
040101; 120403
Abstract
The present study performs a three-level meta-analysis to investigate the overall effectiveness of automated writing evaluation (AWE) on EFL/ESL students' writing performance. Twenty-four primary studies contributing 85 between-group effect sizes and 34 studies contributing 178 within-group effect sizes, published between 1993 and 2021, were meta-analyzed separately. The results indicate a medium overall between-group effect size (g = 0.59) and a large overall within-group effect size (g = 0.98) for AWE on student writing performance. Moderator analyses show that: (1) AWE is more effective at improving vocabulary use than at improving grammar in students' writing; (2) Grammarly shows potential as a highly effective tool, whereas Pigai did not demonstrate such effectiveness; (3) medium-to-long durations of AWE use yield a larger effect on writing outcomes than non-AWE treatment, whereas short durations yield a smaller effect; (4) working with peers under the AWE condition potentially produces a large effect; (5) AWE benefits students at the undergraduate level, students in EFL contexts, and students with intermediate English proficiency. Directions for future research are also discussed. Overall, AWE is a beneficial application and is recommended for integration into the writing classroom.
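The effect sizes above are reported as Hedges' g, a bias-corrected standardized mean difference. A minimal sketch of how a between-group g is typically computed, using hypothetical writing scores rather than data from the meta-analysis:

```python
import math

def hedges_g(m1, sd1, n1, m2, sd2, n2):
    """Hedges' g between a treatment (AWE) and a comparison group.
    Illustrative sketch only, not the authors' analysis code."""
    # Pooled standard deviation across the two groups
    sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    d = (m1 - m2) / sp                    # Cohen's d
    j = 1 - 3 / (4 * (n1 + n2 - 2) - 1)  # small-sample bias correction
    return j * d

# Hypothetical post-test writing scores: AWE class vs. non-AWE class
g = hedges_g(78.0, 10.0, 30, 72.0, 10.0, 30)  # about 0.59, a medium effect
```

In a three-level model, many such g values (several per study) are then pooled while modeling sampling variance, within-study variance, and between-study variance.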
Pages: 727-744
Page count: 18
Related papers
(50 total)
  • [21] Effectiveness of Systemic Text Analysis in EFL Writing Instruction
    Velasco Tovar, Ender
    GIST-Education and Learning Research Journal, 2016, (13): 11 - 33
  • [22] On Automated Online Essay Scoring in Chinese EFL Writing Evaluation
    Yuan, Xi-ming
    INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND SOFTWARE ENGINEERING (AISE 2014), 2014, : 527 - 530
  • [23] A comprehensive three-level meta-analysis of the positive discipline programme: effectiveness and moderating variables
    Yu, Yanni
    Fang, Jingwei
    Chen, Huan
    EARLY CHILD DEVELOPMENT AND CARE, 2024, 194 (15-16) : 1468 - 1484
  • [24] Writing to Read: A Meta-Analysis of the Impact of Writing and Writing Instruction on Reading
    Graham, Steve
    Hebert, Michael
    HARVARD EDUCATIONAL REVIEW, 2011, 81 (04) : 710 - 744
  • [25] Automated Writing Evaluation for ESL Learners: A Case Study of Pigai System
    Yao, Don
    JOURNAL OF ASIA TEFL, 2021, 18 (03): 949 - 958
  • [26] Three-level meta-analysis of dependent effect sizes
    Van den Noortgate, Wim
    Antonio Lopez-Lopez, Jose
    Marin-Martinez, Fulgencio
    Sanchez-Meca, Julio
    BEHAVIOR RESEARCH METHODS, 2013, 45 (02) : 576 - 594
  • [28] Divorce and physical health: A three-level meta-analysis
    Pellon-Elexpuru, Ines
    Van Dijk, Rianne
    Van der Valk, Inge
    Martinez-Pampliega, Ana
    Molleda, Asier
    Cormenzana, Susana
    SOCIAL SCIENCE & MEDICINE, 2024, 352
  • [29] Analysis of writing strategies of EFL writers at the secondary level
    Zheng, Liyan (郑丽妍)
    校园英语 (Campus English), 2019, (43): 91 - 92
  • [30] Teachers in automated writing evaluation (AWE) system-supported ESL writing classes: Perception, implementation, and influence
    Li, Zhi
    SYSTEM, 2021, 99