Automated Japanese essay scoring system based on articles written by experts

  • Authors:
  • Tsunenori Ishioka; Masayuki Kameda

  • Affiliations:
  • The National Center for University Entrance Examinations, Tokyo, Japan; Software Research Center, Ricoh Co., Ltd., Tokyo, Japan

  • Venue:
  • ACL-44: Proceedings of the 21st International Conference on Computational Linguistics and the 44th Annual Meeting of the Association for Computational Linguistics
  • Year:
  • 2006


Abstract

We have developed an automated Japanese essay scoring system called Jess. The system requires expert writings rather than expert raters to build its evaluation model: an essay is evaluated by detecting statistical outliers in a predetermined set of essay features, measured against a large collection of professional writings for each prompt. Three kinds of features are examined: (1) rhetoric --- syntactic variety, or the use of various structures in the arrangement of phrases, clauses, and sentences; (2) organization --- characteristics associated with the orderly presentation of ideas, such as rhetorical features and linguistic cues; and (3) content --- vocabulary related to the topic, such as relevant information and precise or specialized vocabulary. The final evaluation score is calculated by deducting points from a perfect score, based on a model learned from editorials and columns in the Mainichi Daily News newspaper. A diagnosis of the essay is also provided.
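
The outlier-based deduction described in the abstract can be illustrated with a minimal sketch. The snippet below assumes a simple z-score threshold against the expert-writing distribution and a fixed per-outlier penalty; the feature set, threshold, and deduction rule are illustrative placeholders, not the actual Jess implementation.

```python
import numpy as np

def score_essay(essay_features, expert_feature_matrix,
                perfect_score=10.0, penalty_per_outlier=1.0, z_threshold=2.0):
    """Hypothetical sketch of outlier-based deduction scoring.

    essay_features:        1-D array of feature values (rhetoric, organization,
                           and content measures) for the essay being scored.
    expert_feature_matrix: 2-D array (n_expert_texts x n_features) of the same
                           features computed from professional writings for the
                           prompt (e.g. newspaper editorials and columns).
    """
    mean = expert_feature_matrix.mean(axis=0)
    std = expert_feature_matrix.std(axis=0)
    std[std == 0] = 1e-9                      # avoid division by zero

    # A feature counts as an outlier if it lies far from the expert distribution.
    z = np.abs((essay_features - mean) / std)
    n_outliers = int((z > z_threshold).sum())

    # Deduct a fixed penalty from the perfect score for each outlying feature.
    return max(perfect_score - penalty_per_outlier * n_outliers, 0.0)
```

In this sketch, an essay whose features all fall within the range typical of professional writing keeps the perfect score, while each deviating feature lowers the score by a fixed amount; the actual system's deduction criteria are learned from the newspaper corpus.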