• Title/Summary/Keyword: An Automated Writing Assessment System

Automatic Adverb Error Correction in Korean Learners' EFL Writing

  • Kim, Jee-Eun
    • International Journal of Contents / v.5 no.3 / pp.65-70 / 2009
  • This paper describes ongoing work on the correction of adverb errors committed by Korean learners studying English as a foreign language (EFL), using an automated English writing assessment system. Adverb errors are commonly found in learners' writings, but handling them rarely draws attention in natural language processing because of the complicated characteristics of adverbs. To detect the errors correctly, adverbs are classified according to their grammatical functions, meanings, and positions within a sentence. Adverb errors are collected from learners' sentences and classified into five categories following a traditional error analysis. The error classification, in conjunction with the adverb categorization, is implemented as a set of mal-rules that automatically identifies the errors. When an error is detected, the system corrects it and suggests error-specific feedback, which includes the type of error, a corrected string, and a brief description of the error. This work suggests how to improve adverb error correction as well as how to provide richer diagnostic feedback to learners.
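The mal-rule idea summarized in this abstract can be illustrated with a small, hypothetical rule. The sketch below is not taken from the paper; it assumes a POS-tagged sentence and checks a single common positional error (an adverb placed between a transitive verb and its direct object), returning the kind of error-specific feedback the abstract mentions: error type, corrected string, and a brief description.

```python
# Hypothetical sketch of one positional mal-rule, assuming a POS-tagged input.
# The paper's actual rule set, adverb categories, and feedback wording are not
# reproduced here.

from typing import List, Optional, Tuple

Token = Tuple[str, str]  # (word, POS tag), e.g. ("speaks", "VERB")

def check_adverb_between_verb_and_object(tokens: List[Token]) -> Optional[dict]:
    """Flag the common learner error of placing an adverb between a
    transitive verb and its direct object ("speaks fluently English")."""
    for i in range(len(tokens) - 2):
        (w1, t1), (w2, t2), (w3, t3) = tokens[i], tokens[i + 1], tokens[i + 2]
        if t1 == "VERB" and t2 == "ADV" and t3 in ("NOUN", "PROPN"):
            return {
                "error_type": "adverb position",
                "original": " ".join([w1, w2, w3]),
                "correction": " ".join([w1, w3, w2]),
                "feedback": f"An adverb such as '{w2}' should not separate "
                            f"the verb '{w1}' from its object '{w3}'.",
            }
    return None  # no error of this type found

if __name__ == "__main__":
    sentence = [("She", "PRON"), ("speaks", "VERB"),
                ("fluently", "ADV"), ("English", "PROPN")]
    print(check_adverb_between_verb_and_object(sentence))
```

A production system would combine many such rules, each tied to an error category and its own feedback template, rather than a single hard-coded pattern.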

Integration of Computerized Feedback to Improve Interactive Use of Written Feedback in English Writing Class

  • Choi, Jaeho
    • Educational Technology International / v.12 no.2 / pp.71-94 / 2011
  • How can an automated essay scoring (AES) program that provides feedback on essays serve as a formative tool for improving ESL writing? Despite the increasing demand for English writing proficiency, English writing instruction has not been effective because of a lack of timely and accurate feedback. In this context, AES has been gaining attention from educators and scholars in ESL/EFL writing education as a possible solution, because it can provide consistent and prompt feedback to student writers. This experimental study examined the impact of different types of feedback in a college ESL writing program using the Criterion AES system. The results reveal the positive impact of AES in a college-level ESL course and differences between the teacher's feedback and the AES feedback. The findings suggest that AES can be effectively integrated into ESL writing instruction as a formative assessment tool.

Effect of Application of Ensemble Method on Machine Learning with Insufficient Training Set in Developing Automated English Essay Scoring System (영작문 자동채점 시스템 개발에서 학습데이터 부족 문제 해결을 위한 앙상블 기법 적용의 효과)

  • Lee, Gyoung Ho; Lee, Kong Joo
    • Journal of KIISE / v.42 no.9 / pp.1124-1132 / 2015
  • In order to train a supervised machine learning algorithm, it is necessary to have non-biased labels and a sufficient amount of training data. However, it is difficult to collect such labels and enough training data to develop an automatic English composition scoring system. In addition, English writing assessment involves a multi-faceted evaluation of the overall quality of an answer, which makes it difficult to choose an appropriate machine learning algorithm for the task. In this paper, we show that these problems can be alleviated through ensemble learning. The experimental results indicate that the ensemble technique achieved overall performance better than that of the individual algorithms.
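As a rough illustration of the ensemble idea described in this abstract, the sketch below combines several heterogeneous regressors into a single scorer over synthetic feature vectors. The features, estimators, and data are placeholders chosen for the example and do not reproduce the paper's features, models, or experimental setup.

```python
# Minimal ensemble-scoring sketch, assuming essays are already represented as
# numeric feature vectors (e.g. length, error counts, lexical statistics).

import numpy as np
from sklearn.ensemble import RandomForestRegressor, VotingRegressor
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))                                 # 60 essays, 8 illustrative features
y = 2 * X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=60)   # synthetic scores

# Averaging predictions from heterogeneous base learners can reduce the
# variance that any single model shows when the training set is small.
ensemble = VotingRegressor([
    ("ridge", Ridge(alpha=1.0)),
    ("svr", SVR(C=1.0)),
    ("forest", RandomForestRegressor(n_estimators=100, random_state=0)),
])

scores = cross_val_score(ensemble, X, y, cv=5, scoring="r2")
print("mean R^2 over 5 folds:", scores.mean())
```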