Application of bidirectional probabilistic character language model in handwritten words recognition
IDEAL'06 Proceedings of the 7th International Conference on Intelligent Data Engineering and Automated Learning
In this paper, a two-level model of handwritten word recognition is considered. On the first level, the consecutive letters are recognized by the same classifier using preprocessed data from the optical device, while on the second level the whole word is recognized. From another point of view, the first level can be treated as a feature reduction stage. Accordingly, two different methods of feature reduction for a handwritten word recognition algorithm are described. On the lower level, methods well known in the literature (for example, the multilayer perceptron and the k-NN algorithm) are taken into account. The classification results from the first level serve as features for the second level, and two cases are considered: the first takes into account only the crisp classification result from the first level, while the second takes into account the support vector of the decision on this level. On the second level, in order to improve word recognition accuracy for both methods, a probabilistic character-level language model is applied. In this model, first-order Markov dependence in the sequence of characters is assumed. Moreover, the possibility of using the Markov model in the forward and backward directions is discussed. For both methods of feature reduction, the appropriate word recognition algorithms are presented, and the Viterbi algorithm is used to find the best solution. A number of experiments were carried out to test the properties of the proposed feature reduction methods; the experimental results are presented and discussed at the end of the paper.
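The second-level decoding described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: a toy three-letter alphabet, made-up bigram probabilities (the first-order Markov character model, here in the forward direction only), and made-up per-position classifier support vectors stand in for trained values. The Viterbi algorithm then finds the most probable character sequence by combining the language model with the first-level soft outputs.

```python
import math

# Hypothetical toy alphabet; the paper's actual character set and
# trained probabilities are not given, so all values are illustrative.
ALPHABET = ["a", "b", "c"]

# First-order Markov (bigram) character model: P(next | prev),
# with "^" as the start-of-word symbol.
BIGRAM = {
    "^": {"a": 0.6, "b": 0.3, "c": 0.1},
    "a": {"a": 0.1, "b": 0.6, "c": 0.3},
    "b": {"a": 0.4, "b": 0.1, "c": 0.5},
    "c": {"a": 0.5, "b": 0.4, "c": 0.1},
}

def viterbi(support, bigram, alphabet):
    """Return the most probable character sequence given per-position
    classifier support vectors (soft outputs) and a bigram model."""
    # delta[s] = best log-probability of any path ending in letter s
    delta = {s: math.log(bigram["^"][s]) + math.log(support[0][s])
             for s in alphabet}
    back = []  # back-pointers for recovering the best path
    for t in range(1, len(support)):
        new_delta, ptr = {}, {}
        for s in alphabet:
            # Best predecessor letter for s at position t.
            prev = max(alphabet,
                       key=lambda p: delta[p] + math.log(bigram[p][s]))
            new_delta[s] = (delta[prev] + math.log(bigram[prev][s])
                            + math.log(support[t][s]))
            ptr[s] = prev
        delta = new_delta
        back.append(ptr)
    # Trace back from the best final letter.
    last = max(alphabet, key=lambda s: delta[s])
    word = [last]
    for ptr in reversed(back):
        word.append(ptr[word[-1]])
    return "".join(reversed(word))

# Soft first-level outputs (support vectors) for a 3-letter word.
support = [
    {"a": 0.7, "b": 0.2, "c": 0.1},
    {"a": 0.3, "b": 0.5, "c": 0.2},
    {"a": 0.2, "b": 0.2, "c": 0.6},
]
print(viterbi(support, BIGRAM, ALPHABET))  # → abc
```

The crisp variant of feature reduction corresponds to replacing each support vector with a one-hot vector for the classifier's top choice; the backward direction of the Markov model would use a bigram table conditioned on the following rather than the preceding character.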