Generative Topographic Mapping (GTM) is a non-linear latent variable model that provides simultaneous visualization and clustering of high-dimensional data. It was originally formulated as a constrained mixture of distributions whose adaptive parameters were determined by Maximum Likelihood (ML) using the Expectation-Maximization (EM) algorithm. In this paper, we define an alternative variational formulation of GTM that provides a full Bayesian treatment of a Gaussian Process (GP)-based variant of GTM. The performance of the proposed Variational GTM is assessed in several experiments with artificial datasets. These experiments highlight the capability of Variational GTM to avoid overfitting through active regularization.
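To make the original ML/EM formulation mentioned above concrete, the following is a minimal, illustrative sketch of standard GTM training in NumPy, not the paper's variational method: a grid of latent points is mapped through an RBF basis into data space, defining a constrained Gaussian mixture whose weights and noise precision are fitted by EM. All names, grid sizes, and the small ridge term are assumptions chosen for the sketch.

```python
import numpy as np

def gtm_em(X, n_latent=9, n_basis=4, sigma=1.0, n_iter=20, seed=0):
    """Fit a toy GTM to data X (N, D) by EM.

    Uses a 1-D latent grid for simplicity; the standard model uses a 2-D grid.
    Returns the basis weights W, noise precision beta, and responsibilities R.
    """
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Latent grid points and RBF basis centres in latent space
    Z = np.linspace(-1.0, 1.0, n_latent)[:, None]        # (K, 1)
    mu = np.linspace(-1.0, 1.0, n_basis)[:, None]        # (M, 1)
    Phi = np.exp(-((Z - mu.T) ** 2) / (2.0 * sigma**2))  # (K, M) basis matrix
    W = rng.normal(scale=0.1, size=(n_basis, D))         # mapping weights
    beta = 1.0                                           # inverse noise variance
    for _ in range(n_iter):
        Y = Phi @ W                                      # (K, D) mixture centres
        d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)  # (N, K) sq. dists
        # E-step: posterior responsibility of each latent point for each datum
        logp = -0.5 * beta * d2
        logp -= logp.max(axis=1, keepdims=True)          # numerical stability
        R = np.exp(logp)
        R /= R.sum(axis=1, keepdims=True)
        # M-step: weighted least squares for W (small ridge for stability),
        # then closed-form update of the noise precision beta
        G = np.diag(R.sum(axis=0))
        W = np.linalg.solve(Phi.T @ G @ Phi + 1e-6 * np.eye(n_basis),
                            Phi.T @ R.T @ X)
        beta = N * D / (R * d2).sum()
    return W, beta, R

# Usage: fit to random 3-D data; R rows are posterior distributions over
# the latent grid, usable for visualization via posterior means/modes.
X = np.random.default_rng(1).normal(size=(50, 3))
W, beta, R = gtm_em(X)
```

In this ML setting, the only regularization is the optional ridge term; the variational formulation proposed in the paper instead places priors over the mapping and integrates them out, which is what provides the active regularization against overfitting.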