A New Quadratic Classifier Applied to Biometric Recognition
Proceedings of the International ECCV 2002 Workshop on Biometric Authentication, Copenhagen
In many statistical pattern recognition methods, the distribution of sample vectors is assumed to be normal, and the quadratic discriminant function derived from the probability density function of the multivariate normal distribution is used for classification. However, its computational cost is O(n²) for n-dimensional vectors. Moreover, if there are not enough training samples, the covariance matrix cannot be estimated accurately. When the dimensionality is large, these drawbacks markedly reduce classification performance. To avoid these problems, this paper proposes a new approximation of the quadratic discriminant function: the small eigenvalues of the covariance matrix are replaced by a constant estimated by maximum likelihood. This approximation not only reduces the computational cost but also improves classification accuracy.
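The idea in the abstract can be sketched as follows. This is a hypothetical NumPy illustration, not the authors' implementation: we keep the k largest eigenvalues of a class covariance matrix, replace the remaining ones by a single constant (taken here to be their mean, the maximum-likelihood estimate under an isotropic-residual model), and evaluate the discriminant using only the top-k eigenvectors, so each sample costs O(nk) instead of O(n²). The names `fit_approx_qdf` and `approx_qdf_score` are ours.

```python
import numpy as np

def fit_approx_qdf(X, k):
    """Fit an approximated quadratic discriminant for one class (sketch).

    Keeps the k dominant eigenpairs of the class covariance and replaces
    the remaining eigenvalues by their mean (ML estimate assumed here)."""
    mu = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)              # ascending order
    eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]  # reorder to descending
    lam = eigvals[:k].copy()          # k dominant eigenvalues, kept exactly
    const = eigvals[k:].mean()        # single constant for the small eigenvalues
    return mu, eigvecs[:, :k], lam, const

def approx_qdf_score(x, mu, V, lam, const, n):
    """Discriminant value (smaller = better fit to the class).

    Uses only the k stored eigenvectors: the Mahalanobis term is split into
    the projected part (exact eigenvalues) and the residual part (constant)."""
    d = x - mu
    proj = V.T @ d                                  # k projections, O(n*k)
    resid_sq = d @ d - proj @ proj                  # energy outside the k-subspace
    maha = np.sum(proj**2 / lam) + resid_sq / const
    logdet = np.sum(np.log(lam)) + (n - len(lam)) * np.log(const)
    return maha + logdet
```

In a classifier, one model would be fit per class and a sample assigned to the class with the smallest score; only k eigenvectors per class need to be stored, which is also where the robustness to scarce training data comes from.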