Local models and Gaussian mixture models for statistical data processing
In recent years, a number of mixtures of local PCA models have been proposed. Most of these models require the user to set both the number of submodels (local models) in the mixture and the dimensionality of each submodel (i.e., the number of principal components). To free the model of these parameters, we propose a greedy expectation-maximization algorithm that finds a suboptimal number of submodels. Given a target retained-variance ratio, the proposed algorithm estimates, for each submodel, the dimensionality that retains this fraction of the variance. We test the proposed method on two classification problems: handwritten digit recognition and two-class ionosphere data classification. The results show that the proposed method performs well.
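The per-submodel dimensionality selection described above can be illustrated with a short sketch. The snippet below (an assumption of this summary, not code from the paper; the function name `pca_dim_for_variance` is hypothetical) picks the smallest number of principal components whose eigenvalues account for a given retained-variance ratio:

```python
import numpy as np

def pca_dim_for_variance(X, ratio=0.9):
    """Return the smallest number of principal components whose
    eigenvalues retain the given fraction of the total variance."""
    Xc = X - X.mean(axis=0)                      # center the data
    cov = np.cov(Xc, rowvar=False)               # sample covariance matrix
    eigvals = np.linalg.eigvalsh(cov)[::-1]      # eigenvalues, descending
    cum = np.cumsum(eigvals) / eigvals.sum()     # cumulative variance ratio
    return int(np.searchsorted(cum, ratio) + 1)  # first index reaching the ratio
```

In a mixture of local PCA models, such a rule would be applied to the data softly assigned to each submodel, so that submodels covering low-dimensional regions of the data keep fewer components than those covering high-dimensional ones.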