Aggregation of Probabilistic PCA Mixtures with a Variational-Bayes Technique Over Parameters
ICPR '10 Proceedings of the 2010 20th International Conference on Pattern Recognition
Mixtures of probabilistic principal component analyzers (MPPCA) have proven effective for modeling high-dimensional data sets that lie on non-linear manifolds. Briefly stated, they perform mixture model estimation and dimensionality reduction in a single process. This paper makes two contributions. First, we present a Bayesian technique for estimating such mixture models. Then, assuming several MPPCA models are available, we address the problem of aggregating them into a single MPPCA model that is as parsimonious as possible. We describe in detail how this can be achieved cost-effectively, without sampling or access to the data, requiring only the mixture parameters. The proposed approach is based on a novel variational-Bayes scheme operating over model parameters. Extensive experimental results and discussion are provided.
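As background for the abstract above, the single-component building block of MPPCA admits a closed-form maximum-likelihood solution (Tipping & Bishop, 1999). The sketch below illustrates that closed form in NumPy on toy data; it is an illustration of probabilistic PCA only, not the paper's variational-Bayes estimation or aggregation scheme, and the function name and toy data are this sketch's own.

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood PPCA fit (Tipping & Bishop, 1999).

    Returns the factor loading matrix W (d x q), the isotropic noise
    variance sigma2, and the mean mu. MPPCA combines several such local
    PPCA models into a mixture, estimated jointly via EM.
    """
    n, d = X.shape
    mu = X.mean(axis=0)
    Xc = X - mu
    # Eigendecomposition of the sample covariance matrix.
    S = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(S)            # ascending order
    evals, evecs = evals[::-1], evecs[:, ::-1]  # descending order
    # ML noise variance: average of the d - q discarded eigenvalues.
    sigma2 = evals[q:].mean()
    # ML loading matrix (arbitrary rotation R taken as identity).
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2, mu

# Toy data: 3-D points near a 1-D subspace plus isotropic noise.
rng = np.random.default_rng(0)
z = rng.normal(size=(500, 1))
X = z @ np.array([[2.0, 1.0, 0.0]]) + 0.1 * rng.normal(size=(500, 3))
W, sigma2, mu = ppca_ml(X, q=1)
```

On this toy set the recovered loading direction aligns with the generating direction [2, 1, 0] and sigma2 is close to the injected noise variance of 0.01.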