Mixtures of probabilistic principal component analyzers. Neural Computation.
Proceedings of the 1998 Conference on Advances in Neural Information Processing Systems II.
Mixtures of Local Linear Subspaces for Face Recognition. CVPR '98: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition.
Modeling the manifolds of images of handwritten digits. IEEE Transactions on Neural Networks.
Ensemble of Independent Factor Analyzers with Application to Natural Image Analysis. Neural Processing Letters.
Modelling high-dimensional data by mixtures of factor analyzers. Computational Statistics & Data Analysis.
Gaussian mixture learning via robust competitive agglomeration. Pattern Recognition Letters.
IEEE Transactions on Signal Processing.
Maximum likelihood estimation of mixtures of factor analyzers. Computational Statistics & Data Analysis.
Mixtures of common factor analyzers for high-dimensional data with missing information. Journal of Multivariate Analysis.
For Bayesian inference in the mixture of factor analyzers model, natural conjugate priors are placed on the parameters, and a Gibbs sampler that generates parameter samples from the posterior is constructed. In addition, a deterministic estimation algorithm is derived by taking the modes, rather than samples, of the conditional posteriors used in the Gibbs sampler; this can be regarded as a maximum a posteriori estimation algorithm with a hyperparameter search. The behaviors of the Gibbs sampler and the deterministic algorithm are compared in a simulation experiment.
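The sample-versus-mode distinction described above can be sketched on a much simpler conjugate model than the full mixture of factor analyzers: a single Gaussian with unknown mean and precision. The same loop alternates over the full conditionals, either drawing from each one (Gibbs) or taking its mode (the deterministic, ICM/MAP-style variant). All variable names, prior settings, and the toy model itself are illustrative assumptions, not the paper's actual construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from N(mu=2, sd=1); a toy stand-in for the real model.
x = rng.normal(2.0, 1.0, size=500)
n, xbar = len(x), x.mean()

# Conjugate priors (assumed, weakly informative):
# mu ~ Normal(m0, 1/k0), tau ~ Gamma(a0, b0)  (tau is the precision)
m0, k0, a0, b0 = 0.0, 1e-3, 1e-3, 1e-3

def sweep(num_iter, sample):
    """Alternate over the two full conditionals.
    sample=True  -> draw from each conditional (Gibbs sampler)
    sample=False -> take each conditional's mode (deterministic variant)"""
    mu, tau = 0.0, 1.0
    trace = []
    for _ in range(num_iter):
        # Conditional for mu given tau: Normal(mean, 1/prec)
        prec = k0 + n * tau
        mean = (k0 * m0 + n * tau * xbar) / prec
        mu = rng.normal(mean, prec ** -0.5) if sample else mean
        # Conditional for tau given mu: Gamma(a, rate=b); mode = (a-1)/b
        a = a0 + n / 2.0
        b = b0 + 0.5 * np.sum((x - mu) ** 2)
        tau = rng.gamma(a, 1.0 / b) if sample else max(a - 1.0, 1e-12) / b
        trace.append((mu, tau))
    return np.array(trace)

samples = sweep(2000, sample=True)[500:]   # Gibbs, burn-in discarded
modes = sweep(50, sample=False)            # deterministic fixed-point iteration

mu_hat = samples[:, 0].mean()   # posterior-mean estimate from samples
mu_map = modes[-1, 0]           # mode-based point estimate
```

The Gibbs run yields a cloud of samples that quantifies posterior uncertainty, while the deterministic sweep converges quickly to a single point estimate; in the factor-analyzer setting the conditionals are higher-dimensional, but the alternation has the same shape.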