Unsupervised Selection and Discriminative Estimation of Orthogonal Gaussian Mixture Models for Handwritten Digit Recognition

  • Authors:
  • Xuefeng Chen; Xiabi Liu; Yunde Jia

  • Venue:
  • ICDAR '09 Proceedings of the 2009 10th International Conference on Document Analysis and Recognition
  • Year:
  • 2009

Abstract

The problem of determining the appropriate number of components is important in finite mixture modeling for pattern classification. This paper applies an unsupervised clustering method, AutoClass, to the training of Orthogonal Gaussian Mixture Models (OGMMs): the number of components in the OGMM of each class is selected by AutoClass. In this way, the OGMM structures of different classes need not be identical, as they are in the usual modeling scheme, so the dissimilarity between the data distributions of different classes can be described more accurately. After model selection is completed, a discriminative learning framework for Bayesian classifiers, called Max-Min posterior Pseudo-probabilities (MMP), is employed to estimate the component parameters of each class's OGMM. We apply the proposed OGMM learning approach to handwritten digit recognition. Experimental results on the MNIST database show the effectiveness of our approach.
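The sketch below illustrates the overall pipeline the abstract describes, not the authors' method: a separate diagonal-covariance Gaussian mixture is fit per digit class, with the number of components chosen independently for each class by an unsupervised criterion. BIC is used here as a stand-in for AutoClass, plain EM maximum-likelihood estimation replaces the MMP discriminative training step, and scikit-learn's small digits dataset stands in for MNIST.

```python
# Hedged sketch: per-class GMMs with class-specific component counts.
# BIC replaces AutoClass and EM replaces MMP; this is only an illustration
# of the structure-selection idea, not the paper's algorithm.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

class_models = {}
for label in np.unique(y_train):
    Xc = X_train[y_train == label]
    # Select the number of components per class by minimizing BIC,
    # so different classes may end up with different model structures.
    best_gmm, best_bic = None, np.inf
    for k in range(1, 6):
        gmm = GaussianMixture(n_components=k, covariance_type="diag",
                              reg_covar=1e-3, random_state=0).fit(Xc)
        bic = gmm.bic(Xc)
        if bic < best_bic:
            best_gmm, best_bic = gmm, bic
    class_models[label] = best_gmm

# Classify by the largest class-conditional log-likelihood (uniform priors).
labels = sorted(class_models)
scores = np.column_stack([class_models[c].score_samples(X_test) for c in labels])
y_pred = np.array(labels)[scores.argmax(axis=1)]
print("accuracy:", (y_pred == y_test).mean())
```

Because each class chooses its own component count, a class with a multimodal feature distribution can receive more components than a compact one, which is the structural flexibility the paper attributes to AutoClass-based selection.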