IEICE - Transactions on Information and Systems
In statistical pattern recognition, a Gaussian mixture model is often used to represent the distribution of feature vectors. Its parameters are usually estimated from given sample data by the expectation-maximization (EM) algorithm. However, when the number of data attributes is large relative to the sample size, the parameters cannot be estimated reliably. In this paper, we propose a novel approach that estimates the parameters of the Gaussian mixture model from sample data located on the boundaries of the regions defined by the component density functions. Experiments are carried out to show the characteristics of the proposed method.
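For context, the standard EM pipeline that the abstract contrasts against can be sketched as below. This is a generic one-dimensional EM implementation in NumPy, not the paper's boundary-based method; the function name `em_gmm` and all initialization choices (means drawn from random samples, shared initial variance, uniform weights) are illustrative assumptions.

```python
import numpy as np

def em_gmm(X, k, n_iter=100, seed=0):
    """Fit a k-component 1-D Gaussian mixture to data X with plain EM."""
    rng = np.random.default_rng(seed)
    n = len(X)
    # Illustrative initialization: means from random data points,
    # shared sample variance, uniform mixing weights.
    mu = rng.choice(X, size=k, replace=False)
    var = np.full(k, X.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each point.
        dens = (pi * np.exp(-0.5 * (X[:, None] - mu) ** 2 / var)
                / np.sqrt(2 * np.pi * var))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * X[:, None]).sum(axis=0) / nk
        var = (resp * (X[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Synthetic data from two well-separated Gaussians.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(-3, 1, 500), rng.normal(3, 1, 500)])
pi, mu, var = em_gmm(X, k=2)
```

In high-dimensional settings each component additionally requires a full covariance matrix, whose entry count grows quadratically with the dimension; this is the regime in which plain EM becomes unreliable with limited samples.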