Principal Component Analysis, when formulated as a probabilistic model, can be made robust to outliers by replacing the Gaussian noise assumption with a Student-t distribution. Mixtures of PCA, in turn, aim to capture nonlinear dependencies in data by finding clusters and identifying local linear submanifolds. This paper shows how mixtures of PCA can likewise be made robust to outliers. The model is formulated hierarchically, and its parameters are estimated by maximum likelihood. Experiments show that the method remains effectively robust to outliers even for high-dimensional data.
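To make the Student-t idea concrete, here is a minimal NumPy sketch of robust probabilistic PCA via EM, not the paper's exact algorithm. The latent precision scale of the t distribution yields per-sample weights in the E-step, so outliers are down-weighted; the M-step is a weighted PCA with the closed-form PPCA solution. Function name, the fixed degrees of freedom `nu`, and the iteration count are illustrative assumptions.

```python
import numpy as np

def robust_ppca(X, q=2, nu=4.0, n_iter=50, seed=0):
    """EM sketch for Student-t PPCA (robust probabilistic PCA).

    X: (N, d) data; q: latent dimension; nu: fixed t degrees of freedom.
    Returns mean, loading matrix W, noise variance, and final sample weights.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mu = X.mean(axis=0)
    W = rng.standard_normal((d, q))   # arbitrary init of the loadings
    sigma2 = 1.0
    for _ in range(n_iter):
        # E-step: posterior mean of the latent precision scale u_n of the
        # Student-t; large Mahalanobis distance -> small weight (outlier).
        C = W @ W.T + sigma2 * np.eye(d)
        Cinv = np.linalg.inv(C)
        diff = X - mu
        maha = np.einsum('ni,ij,nj->n', diff, Cinv, diff)
        w = (nu + d) / (nu + maha)
        # M-step: weighted mean and scatter, then closed-form PPCA fit
        # (top-q eigenvectors; sigma2 = mean of discarded eigenvalues).
        mu = (w[:, None] * X).sum(axis=0) / w.sum()
        diff = X - mu
        S = (w[:, None] * diff).T @ diff / N
        evals, evecs = np.linalg.eigh(S)
        evals, evecs = evals[::-1], evecs[:, ::-1]
        sigma2 = evals[q:].mean()
        W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 1e-12))
    return mu, W, sigma2, w
```

Extending this to a robust *mixture* of PCA, as the paper does, adds cluster responsibilities as a second set of E-step weights, so each component runs the weighted update above on the points it is responsible for.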