A Multi-Dimensional Analysis of Writing Flexibility in an Automated Writing Evaluation System

Cited by: 7
Authors
Allen, Laura K. [1 ]
Likens, Aaron D. [2 ]
McNamara, Danielle S. [2 ]
Affiliations
[1] Mississippi State Univ, Mississippi State, MS 39762 USA
[2] Arizona State Univ, Tempe, AZ USA
Keywords
writing; flexibility; natural language processing; feedback; revision; instruction; student
DOI
10.1145/3170358.3170404
Chinese Library Classification (CLC)
TP301 [Theory, Methods]
Discipline Code
081202
Abstract
The assessment of writing proficiency generally relies on analyses of the specific linguistic and rhetorical features contained in the individual essays produced by students. Researchers have recently proposed, however, that an individual's ability to flexibly adapt the linguistic properties of their writing may more closely capture writing skill. Yet the features of the task, learner, and educational context that influence this flexibility remain largely unknown. The current study extends this research by examining relations among linguistic flexibility, reading comprehension ability, and feedback in the context of an automated writing evaluation system. Students (n = 131) wrote and revised six essays in an automated writing evaluation system and were provided both summative and formative feedback on their writing. Additionally, half of the students had access to a spelling and grammar checker that provided lower-level feedback during the writing period. The results provide evidence that developing writers demonstrate linguistic flexibility across the essays they produce. However, analyses also indicate that lower-level feedback (i.e., spelling and grammar feedback) has little to no impact on the properties of students' essays or on their variability across prompts and drafts. Overall, the current study provides important insights into the role of flexibility in writing skill and lays a strong foundation for future research and educational interventions.
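The abstract frames linguistic flexibility as variation in the linguistic properties of a student's writing across essays and drafts. The sketch below (Python) shows one hypothetical way such a quantity could be operationalized: compute a few surface-level text features per essay and summarize each student's flexibility as the within-student spread of those features. The feature set, the flexibility index, and all function names here are illustrative assumptions, not the measures or pipeline used in the paper.

# Hypothetical illustration (not the authors' pipeline): operationalize
# "linguistic flexibility" as the within-student spread of simple text
# features across that student's essays.
import re
from statistics import mean, pstdev

def text_features(essay):
    """Compute a few surface-level features for a single essay."""
    words = re.findall(r"[A-Za-z']+", essay.lower())
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    return {
        "mean_word_length": mean(len(w) for w in words) if words else 0.0,
        "words_per_sentence": len(words) / len(sentences) if sentences else 0.0,
        "type_token_ratio": len(set(words)) / len(words) if words else 0.0,
    }

def flexibility_index(essays):
    """Per-feature population standard deviation across one student's essays.
    Larger values indicate more variation, one possible proxy for flexibility."""
    feats = [text_features(e) for e in essays]
    if not feats:
        return {}
    return {name: pstdev(f[name] for f in feats) for name in feats[0]}

if __name__ == "__main__":
    essays = [
        "Writing is a skill. It improves with practice and feedback.",
        "Flexible writers adapt their vocabulary, syntax, and structure to each new prompt.",
    ]
    print(flexibility_index(essays))

In the study itself, variability is examined across six prompts and their revisions; a per-feature spread such as this would simply be computed over those drafts, but any resemblance to the paper's actual multi-dimensional measures is not implied.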
Pages: 380-388
Number of pages: 9