Decision making in Markov chains applied to the problem of pattern recognition

  • Authors:
  • J. Raviv

  • Affiliations:
  • -

  • Venue:
  • IEEE Transactions on Information Theory
  • Year:
  • 2006

Abstract

In many pattern-recognition problems there exist dependencies among the patterns to be recognized. In the past, these dependencies have not been introduced into the mathematical model when designing an optimal pattern-recognition system. In this paper the optimal decision rule is derived under the assumption of Markov dependence among the patterns to be recognized. Subsequently, this decision rule is applied to character-recognition problems. The main idea is to appropriately balance the information obtained from contextual considerations with the information obtained from measurements on the character being recognized, and to arrive at a decision using both. Bayes' decision in Markov chains is presented, and this mode of decision is adapted to character recognition. A look-ahead mode of decision is presented. The problem of estimating transition probabilities is discussed. The experimental system is described, and results of experiments on English legal text and names are presented.
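
As a rough illustration of the kind of decision rule the abstract describes — combining contextual information (Markov transition probabilities between characters) with measurement information (per-character likelihoods) — the sketch below chooses each character by maximizing its posterior probability, computed with a forward-backward recursion over the whole sequence. This is a minimal sketch under assumed inputs, not the paper's exact algorithm or notation; the function name `bayes_decide`, the bigram model, and all probability values are illustrative.

```python
# Minimal sketch: Bayes' decision for a character sequence with Markov
# (bigram) dependence.  Each character is chosen to maximize its posterior
# probability given all measurements, combining contextual information
# (transition probabilities) with measurement information (likelihoods).
# All names and numbers here are illustrative assumptions.

import numpy as np

def bayes_decide(likelihoods, transitions, prior):
    """likelihoods: (T, K) array, likelihoods[t, k] = p(measurement_t | char k)
    transitions:  (K, K) array, transitions[i, j] = p(char_{t+1} = j | char_t = i)
    prior:        (K,)   array, prior[k] = p(char_0 = k)
    Returns the index of the most probable character at each position."""
    T, K = likelihoods.shape

    # Forward pass: alpha[t, k] proportional to p(measurements_{0..t}, char_t = k)
    alpha = np.zeros((T, K))
    alpha[0] = prior * likelihoods[0]
    alpha[0] /= alpha[0].sum()          # normalize to avoid numerical underflow
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ transitions) * likelihoods[t]
        alpha[t] /= alpha[t].sum()

    # Backward pass: beta[t, k] proportional to p(measurements_{t+1..T-1} | char_t = k)
    beta = np.ones((T, K))
    for t in range(T - 2, -1, -1):
        beta[t] = transitions @ (beta[t + 1] * likelihoods[t + 1])
        beta[t] /= beta[t].sum()

    # Posterior marginal p(char_t = k | all measurements) is proportional to
    # alpha * beta; Bayes' decision picks the character with the largest posterior.
    posterior = alpha * beta
    return posterior.argmax(axis=1)

# Toy usage with two characters and three measurements (made-up numbers).
chars = ["a", "b"]
lik = np.array([[0.6, 0.4], [0.3, 0.7], [0.5, 0.5]])
trans = np.array([[0.8, 0.2], [0.3, 0.7]])
prior = np.array([0.5, 0.5])
print([chars[i] for i in bayes_decide(lik, trans, prior)])
```

A decision rule of this form minimizes the per-character error probability; restricting the backward pass to a few future measurements would give a bounded "look-ahead" variant in the spirit of the look-ahead mode mentioned in the abstract.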