Bayesian PCA (BPCA) provides Bayesian inference for probabilistic PCA, for which several prior distributions have been devised; for example, automatic relevance determination (ARD) is used to determine the dimensionality. However, the choice of prior is arbitrary, and different prior settings lead to different estimates. This article presents a standard setting of the prior distribution for BPCA. We first define a general hierarchical prior for BPCA and derive an exact predictive distribution. We then show that several previously proposed priors can be regarded as special cases of this general prior. By comparing various priors, we show that BPCA with nearly non-informative hierarchical priors performs best.
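To make the ARD mechanism mentioned above concrete, here is a minimal sketch of Bayesian PCA with an ARD prior, following the standard EM-style updates from Bishop's Bayesian PCA rather than the hierarchical-prior method this article proposes. The function name `bpca_ard`, the pruning threshold `prune_tol`, and the toy data are illustrative assumptions, not the paper's algorithm.

```python
# A minimal sketch of Bayesian PCA with an ARD prior (assumption: EM-style
# updates in the spirit of Bishop's Bayesian PCA, not this article's method).
import numpy as np

def bpca_ard(X, q=None, n_iter=200, prune_tol=1e4, seed=0):
    """MAP EM for probabilistic PCA with an ARD prior on the columns of W.

    Columns whose ARD precision alpha_i grows large carry no signal and are
    effectively pruned, so the retained dimensionality is selected
    automatically rather than fixed in advance.
    """
    rng = np.random.default_rng(seed)
    N, d = X.shape
    q = d - 1 if q is None else q      # start from the maximal latent dimension
    mu = X.mean(axis=0)
    Xc = X - mu
    W = rng.standard_normal((d, q))
    sigma2 = 1.0
    alpha = np.ones(q)                 # ARD precisions, one per column of W

    for _ in range(n_iter):
        # E-step: posterior moments of the latent variables z_n
        M = W.T @ W + sigma2 * np.eye(q)
        Minv = np.linalg.inv(M)
        Ez = Xc @ W @ Minv                           # E[z_n], stacked as (N, q)
        Ezz = N * sigma2 * Minv + Ez.T @ Ez          # sum_n E[z_n z_n^T]

        # M-step: MAP update of W; the sigma2 * diag(alpha) term shrinks
        # columns with high ARD precision toward zero
        W = (Xc.T @ Ez) @ np.linalg.inv(Ezz + sigma2 * np.diag(alpha))

        # Update the isotropic noise variance
        sigma2 = (np.sum(Xc**2) - 2 * np.sum(Ez * (Xc @ W))
                  + np.trace(Ezz @ W.T @ W)) / (N * d)

        # Re-estimate ARD precisions: alpha_i = d / ||w_i||^2
        alpha = d / np.maximum(np.sum(W**2, axis=0), 1e-12)

    effective_dim = int(np.sum(alpha < prune_tol))
    return W, sigma2, alpha, effective_dim

# Toy check: 3 signal directions embedded in 10 observed dimensions.
rng = np.random.default_rng(1)
Z = rng.standard_normal((500, 3))
A = rng.standard_normal((3, 10)) * 3.0
X = Z @ A + 0.1 * rng.standard_normal((500, 10))
_, _, alpha, k = bpca_ard(X)
print("estimated dimensionality:", k)  # should recover 3 under these settings
```

On this toy data the precisions `alpha` split into a small group (retained columns) and a very large group (pruned columns), which is exactly the ARD-based dimensionality determination the abstract refers to; the article's contribution is to embed such priors in a general hierarchical family and compare them.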