Bidirectional language model for handwriting recognition

  • Authors:
  • Volkmar Frinken; Alicia Fornés; Josep Lladós; Jean-Marc Ogier

  • Affiliations:
  • Computer Vision Center, Dept. of Computer Science, Edifici O, UAB, Spain (Frinken, Fornés, Lladós); L3i Laboratory, Université de La Rochelle, La Rochelle Cédex 1, France (Ogier)

  • Venue:
  • SSPR'12/SPR'12 Proceedings of the 2012 Joint IAPR International Conference on Structural, Syntactic, and Statistical Pattern Recognition
  • Year:
  • 2012

Abstract

In order to improve the results of automatically recognized handwritten text, information about the language is commonly included in the recognition process. A common approach represents a text line as a sequence that is processed in one direction, with the language information included directly in the decoding via n-grams. This approach, however, uses context from only one side to estimate a word's probability. Therefore, we propose a bidirectional recognition in this paper, using distinct forward and backward language models. By combining decoding hypotheses from both directions, we achieve a significant increase in recognition accuracy for the off-line, writer-independent handwriting recognition task. Both language models are of the same type and can be estimated on the same corpus. Hence, the increase in recognition accuracy comes without any additional need for training data or language modeling complexity.
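The idea of scoring hypotheses with both a forward and a backward n-gram model can be sketched as follows. This is an illustrative toy example, not the authors' decoder: it trains add-alpha-smoothed bigram models on a tiny hypothetical corpus (the forward model on the sentences as-is, the backward model on reversed sentences) and ranks candidate word sequences by the sum of the two log-probabilities. The corpus, the smoothing constant, and the equal weighting of the two directions are all assumptions made for the sketch.

```python
from collections import defaultdict
import math

def bigram_counts(sentences):
    # Collect unigram and bigram counts, with sentence-boundary markers.
    uni, bi = defaultdict(int), defaultdict(int)
    for words in sentences:
        seq = ["<s>"] + words + ["</s>"]
        for a, b in zip(seq, seq[1:]):
            uni[a] += 1
            bi[(a, b)] += 1
    return uni, bi

def log_prob(words, uni, bi, alpha=1.0, vocab=1000):
    # Add-alpha smoothed bigram log-probability of a word sequence.
    # (vocab is an assumed vocabulary size for smoothing.)
    seq = ["<s>"] + words + ["</s>"]
    lp = 0.0
    for a, b in zip(seq, seq[1:]):
        lp += math.log((bi[(a, b)] + alpha) / (uni[a] + alpha * vocab))
    return lp

# Toy training corpus (assumption, for illustration only).
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog sat".split(),
]

# Forward LM on the corpus as-is; backward LM on the reversed sentences.
# Both models are the same type and trained on the same corpus,
# mirroring the paper's point that no extra training data is needed.
f_uni, f_bi = bigram_counts(corpus)
b_uni, b_bi = bigram_counts([w[::-1] for w in corpus])

def combined_score(words):
    # Combine both directions; equal weights are an assumption here.
    return (log_prob(words, f_uni, f_bi)
            + log_prob(words[::-1], b_uni, b_bi))

# Rank two competing recognition hypotheses for the same text line.
hypotheses = ["the cat sat".split(), "sat cat the".split()]
best = max(hypotheses, key=combined_score)
print(" ".join(best))  # → the cat sat
```

The well-formed hypothesis wins because its bigrams are attested in both directions, while the scrambled one relies entirely on smoothing mass; in the actual system, such combined scores would re-rank hypotheses produced by the handwriting recognizer.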