On the automatic scoring of handwritten essays

  • Authors:
  • Sargur Srihari; Rohini Srihari; Pavithra Babu; Harish Srinivasan

  • Affiliations:
  • Center of Excellence for Document Analysis and Recognition, University at Buffalo, State University of New York, Amherst, New York (all authors)

  • Venue:
  • IJCAI'07: Proceedings of the 20th International Joint Conference on Artificial Intelligence
  • Year:
  • 2007


Abstract

Automating the task of scoring short handwritten student essays is considered. The goal is to assign scores which are comparable to those of human scorers by coupling two AI technologies: optical handwriting recognition and automated essay scoring. The test-bed is that of essays written by children in reading comprehension tests. The process involves several image-level operations: removal of pre-printed matter, segmentation of handwritten text lines and extraction of words. Recognition constraints are provided by the reading passage, the question and the answer rubric. Scoring is based on using a vector space model and machine learning of parameters from a set of human-scored samples. System performance is comparable to that of scoring based on perfect manual transcription.
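The abstract's scoring step — comparing a recognized answer to an answer rubric with a vector space model — can be sketched minimally as cosine similarity between term-frequency vectors. This is a hedged illustration, not the paper's actual system: the tokenizer, the example rubric and answer text, and the use of raw term frequencies (rather than learned parameters) are all assumptions for demonstration.

```python
from collections import Counter
from math import sqrt

def term_vector(text):
    """Bag-of-words term-frequency vector (hypothetical tokenizer: lowercase split)."""
    return Counter(text.lower().split())

def cosine_similarity(v1, v2):
    """Cosine of the angle between two sparse term vectors."""
    dot = sum(count * v2[term] for term, count in v1.items())
    norm1 = sqrt(sum(c * c for c in v1.values()))
    norm2 = sqrt(sum(c * c for c in v2.values()))
    if norm1 == 0.0 or norm2 == 0.0:
        return 0.0
    return dot / (norm1 * norm2)

# Hypothetical rubric and recognized student answer from a reading
# comprehension question; the real system learns scoring parameters
# from human-scored samples rather than using raw similarity.
rubric = "the dog ran away because the gate was left open"
answer = "the dog escaped because the gate was open"
score = cosine_similarity(term_vector(answer), term_vector(rubric))
```

In the full system described by the abstract, this similarity would feed into a scoring function whose parameters are learned from human-scored training samples, so the raw cosine value here stands in only for the vector-space comparison step.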