Generative topographic mapping (GTM) is a statistical model that, like the self-organizing map (SOM), extracts a hidden smooth manifold from data. A deterministic search algorithm for the hyperparameters regulating the smoothness of the manifold has been proposed previously, but it relies on approximations that are valid only when data are abundant, so it often fails to yield suitable estimates on small datasets. In this paper, to improve the hyperparameter search in GTM, we construct a Gibbs sampler on the model, which generates series of random samples following the posteriors over the hyperparameters; reliable estimates are obtained from these samples. In addition, we derive another deterministic algorithm using ensemble learning. An experimental comparison of these algorithms suggests an efficient method for reliable estimation in GTM.
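To illustrate the kind of sampler the abstract describes, the sketch below runs Gibbs sampling on a much simpler conjugate model than GTM: a one-weight linear regression with a Gaussian prior on the weight, where the prior precision `alpha` and the noise precision `beta` play the role of GTM's smoothness-regulating hyperparameters. All variable names and the Gamma hyperprior values (`a0`, `b0`) are assumptions for this toy example, not the paper's actual algorithm; the point is only the alternation between sampling the parameter given the hyperparameters and sampling each hyperparameter from its conditional posterior.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = w * x + Gaussian noise (a stand-in for GTM's mapping).
n = 200
x = rng.normal(size=n)
w_true, beta_true = 2.0, 4.0          # true noise precision 4 -> noise std 0.5
y = w_true * x + rng.normal(scale=beta_true ** -0.5, size=n)

def gibbs(x, y, iters=2000, a0=1e-3, b0=1e-3):
    """Gibbs sampler alternating over weight w, prior precision alpha,
    and noise precision beta (vague Gamma(a0, b0) hyperpriors assumed)."""
    alpha, beta = 1.0, 1.0
    samples = []
    for _ in range(iters):
        # w | alpha, beta : Gaussian conditional (conjugacy)
        prec = alpha + beta * (x @ x)
        mean = beta * (x @ y) / prec
        w = rng.normal(mean, prec ** -0.5)
        # alpha | w : Gamma posterior from the Gaussian prior on w
        alpha = rng.gamma(a0 + 0.5, 1.0 / (b0 + 0.5 * w * w))
        # beta | w : Gamma posterior from the Gaussian likelihood
        resid = y - w * x
        beta = rng.gamma(a0 + 0.5 * n, 1.0 / (b0 + 0.5 * (resid @ resid)))
        samples.append((w, alpha, beta))
    return np.array(samples)

s = gibbs(x, y)
burn = 500                            # discard burn-in before averaging
w_hat = s[burn:, 0].mean()
beta_hat = s[burn:, 2].mean()
```

Averaging the post-burn-in samples recovers the generating values (`w_hat` near 2, `beta_hat` near 4), which is the sense in which the sample series yields reliable hyperparameter estimates where a point approximation on scarce data might not.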