Automated Essay Scoring System Based on Rubric

Cited by: 3
Authors
Yamamoto, Megumi [1 ]
Umemura, Nobuo [2 ]
Kawano, Hiroyuki [3 ]
Affiliations
[1] Nagoya Univ Foreign Studies, Sch Contemporary Int Studies, Nisshin 4700197, Japan
[2] Nagoya Univ Arts & Sci, Sch Media & Design, Nagoya, Aichi 4700196, Japan
[3] Nanzan Univ, Fac Sci & Engn, Nagoya, Aichi 4668673, Japan
Keywords
Automated scoring; Essay evaluation; Rubric; Cosine similarity; Support vector machine; Multiple regression model;
DOI
10.1007/978-3-319-64051-8_11
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory]
Discipline Classification Codes
081104; 0812; 0835; 1405
Abstract
In this paper, we propose an architecture for a rubric-based automated essay scoring system that combines automated scoring with human scoring. Rubrics provide valid criteria for grading students' essays. Our proposed rubric has five evaluation viewpoints, "Contents, Structure, Evidence, Style, and Skill," and 25 evaluation items that subdivide these viewpoints. The system is a cloud-based application consisting of several tools, such as Moodle, R, MeCab, and RedPen. First, the system automatically scores the 11 items belonging to Style and Skill, such as sentence style, syntax, usage, readability, and lexical richness. It then predicts the scores of Style and Skill from these item scores with a multiple regression model, and it predicts the Contents score from the cosine similarity between the topic and the students' descriptions. Moreover, the system classifies essays into five grades, "A+, A, B, C, D," as useful information for teachers, using machine learning techniques such as the support vector machine. We continue to improve the automated scoring algorithms and to broaden the variety of input essays in order to raise the classification accuracy above 90%.
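The abstract names three modelling steps: cosine similarity for the Contents score, multiple regression for the Style/Skill scores, and an SVM for the five-grade classification. The following is a minimal sketch in R (one of the tools named above) of how these steps might fit together; the toy data, variable names, and the choice of the e1071 package are illustrative assumptions, not details taken from the paper.

```r
library(e1071)  # provides svm()

## Contents score: cosine similarity between topic and essay term vectors.
## Texts are assumed to be tokenized (e.g. with MeCab) and mapped onto a
## shared vocabulary as term-frequency vectors.
cosine_similarity <- function(x, y) {
  sum(x * y) / (sqrt(sum(x^2)) * sqrt(sum(y^2)))
}
topic_vec <- c(2, 1, 0, 3, 1)   # toy term frequencies for the topic text
essay_vec <- c(1, 1, 1, 2, 0)   # toy term frequencies for one essay
contents_score <- cosine_similarity(topic_vec, essay_vec)

## Style score: multiple regression over automatically scored items
## (item1..item3 stand in for the 11 Style/Skill items).
train <- data.frame(
  item1 = c(3, 4, 2, 5, 3, 4),
  item2 = c(2, 5, 3, 4, 2, 5),
  item3 = c(4, 4, 2, 5, 3, 3),
  style = c(3, 5, 2, 5, 3, 4)
)
style_model <- lm(style ~ item1 + item2 + item3, data = train)
style_score <- predict(style_model, data.frame(item1 = 4, item2 = 3, item3 = 4))

## Overall grade: SVM classification into the five grades A+, A, B, C, D.
graded <- data.frame(
  contents = c(0.9, 0.7, 0.5, 0.3, 0.8, 0.4),
  style    = c(5, 4, 3, 2, 5, 2),
  grade    = factor(c("A+", "A", "B", "C", "A", "D"))
)
grade_model <- svm(grade ~ contents + style, data = graded)
predict(grade_model, data.frame(contents = contents_score, style = style_score))
```

In the system described by the abstract, the regression and SVM inputs would presumably come from rubric-scored training essays and from item scores produced by tools such as RedPen and MeCab-based analyses, rather than from the toy vectors used here.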
Pages: 177-190
Number of pages: 14
Related Articles
50 records in total
  • [21] A Multilingual Application for Automated Essay Scoring
    Castro-Castro, Daniel
    Lannes-Losada, Rocio
    Maritxalar, Montse
    Niebla, Ianire
    Perez-Marques, Celia
    Alamo-Suarez, Nancy C.
    Pons-Porrata, Aurora
    [J]. ADVANCES IN ARTIFICIAL INTELLIGENCE - IBERAMIA 2008, PROCEEDINGS, 2008, 5290 : 243 - 251
  • [22] The Impact of Anonymization for Automated Essay Scoring
    Shermis, Mark D.
    Lottridge, Sue
    Mayfield, Elijah
    [J]. JOURNAL OF EDUCATIONAL MEASUREMENT, 2015, 52 (04) : 419 - 436
  • [23] The Nature of Automated Essay Scoring Feedback
    Dikli, Semire
    [J]. CALICO JOURNAL, 2011, 28 (01): 99 - 134
  • [24] Deep Learning in Automated Essay Scoring
    Boulanger, David
    Kumar, Vivekanandan
    [J]. INTELLIGENT TUTORING SYSTEMS, ITS 2018, 2018, 10858 : 294 - 299
  • [25] Automated essay scoring and flexible learning
    Li, RKY
    Oh, KH
    [J]. INFORMATION TECHNOLOGY AND ORGANIZATIONS: TRENDS, ISSUES, CHALLENGES AND SOLUTIONS, VOLS 1 AND 2, 2003, : 369 - 372
  • [26] Automated Essay Scoring: A comparative study
    Yao, Xuemei
    [J]. MECHANICAL ENGINEERING, MATERIALS SCIENCE AND CIVIL ENGINEERING, 2013, 274 : 650 - 653
  • [27] Automated essay scoring: A review of the field
    Lagakis, Paraskevas
    Demetriadis, Stavros
    [J]. PROCEEDINGS OF THE 2021 IEEE INTERNATIONAL CONFERENCE ON COMPUTER, INFORMATION, AND TELECOMMUNICATION SYSTEMS (IEEE CITS 2021), 2021, : 102 - 107
  • [28] Machine Learning-based Automated Essay Scoring System for Chinese Proficiency Test (HSK)
    Xiao, Rui
    Guo, Wenbin
    Zhang, Yunchun
    Ma, Xiaoyan
    Jiang, Jiaqi
    [J]. 2020 4TH INTERNATIONAL CONFERENCE ON NATURAL LANGUAGE PROCESSING AND INFORMATION RETRIEVAL, NLPIR 2020, 2020, : 18 - 23
  • [29] EssayGAN: Essay Data Augmentation Based on Generative Adversarial Networks for Automated Essay Scoring
    Park, Yo-Han
    Choi, Yong-Seok
    Park, Cheon-Young
    Lee, Kong-Joo
    [J]. APPLIED SCIENCES-BASEL, 2022, 12 (12):
  • [30] Determining Writing Genre: Towards a Rubric-based Approach to Automated Essay Grading
    Lam, Hon Wai
    Dillon, Tharam
    Chang, Elizabeth
    [J]. 25TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED INFORMATION NETWORKING AND APPLICATIONS (AINA 2011), 2011, : 270 - 274