Mixtures of probabilistic principal component analyzers model high-dimensional nonlinear data by combining local linear models, with each mixture component extracting the principal orientations in its local region of the data. An important limitation of this generative model is its sensitivity to data points lying off the low-dimensional manifold. To address this problem, we introduce mixtures of robust probabilistic principal component analyzers, which handle atypical points by means of a heavy-tailed distribution, the Student-t. We show that the resulting mixture model is an extension of the mixture of Gaussians, suitable for both robust clustering and dimensionality reduction. Finally, we briefly discuss how to construct a robust version of the closely related mixture of factor analyzers.
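To illustrate the robustness mechanism the abstract refers to, the sketch below runs EM for a single multivariate Student-t distribution with fixed degrees of freedom. This is a deliberate simplification of the paper's model (one component, full covariance, no latent low-dimensional subspace, no mixture): the function name, the choice `nu=4.0`, and the synthetic data are all assumptions made for illustration. The key point it demonstrates is how the E-step assigns each point a weight `u_n = (nu + d) / (nu + maha_n)` that shrinks toward zero for atypical points, so outliers barely influence the M-step estimates.

```python
import numpy as np

def fit_student_t(X, nu=4.0, n_iter=50):
    """EM for a multivariate Student-t with fixed degrees of freedom nu.

    Atypical points receive small weights u_n in the E-step, which makes
    the mean and covariance estimates robust to outliers.
    """
    N, d = X.shape
    mu = X.mean(axis=0)
    sigma = np.cov(X, rowvar=False) + 1e-6 * np.eye(d)
    for _ in range(n_iter):
        # E-step: expected precision scaling per point, based on the
        # squared Mahalanobis distance under the current parameters.
        diff = X - mu
        maha = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(sigma), diff)
        u = (nu + d) / (nu + maha)
        # M-step: weighted mean and (down-weighted) scatter matrix.
        mu = (u[:, None] * X).sum(axis=0) / u.sum()
        diff = X - mu
        sigma = (u[:, None, None] *
                 np.einsum('ni,nj->nij', diff, diff)).mean(axis=0)
        sigma += 1e-6 * np.eye(d)  # ridge for numerical stability
    return mu, sigma, u

# Usage: Gaussian data plus one gross outlier; the t fit downweights it.
rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
X[0] = [25.0, 25.0]  # outlier far off the data cloud
mu, sigma, u = fit_student_t(X)
```

In the full model of the paper, each mixture component combines this Student-t noise model with a probabilistic PCA subspace, so the same weights also protect the estimated principal orientations from off-manifold points.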