In the statistical approach to self-organizing maps (SOMs), learning is regarded as an estimation algorithm for a Gaussian mixture model with a Gaussian smoothing prior on the centroid parameters. The values of the hyperparameters and the topological structure are selected on the basis of a statistical principle. However, because the component selection probabilities are fixed to a common value, the centroids concentrate in areas of high data density. This deforms the coordinate system on the extracted manifold and makes smoothness evaluation for the manifold inaccurate. In this article, we study an extended SOM model whose component selection probabilities are variable. To stabilize the estimation, a smoothing prior on the component selection probabilities is introduced. An estimation algorithm for the parameters and the hyperparameters, based on empirical Bayesian inference, is derived. The performance of density estimation by the new model and by the SOM model is compared in simulation experiments.
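The abstract gives no formulas, but the model it describes (a Gaussian mixture whose centroids carry a Gaussian smoothing prior along a lattice, with the component selection probabilities freed from the common value 1/K) can be sketched as a MAP-EM loop. The sketch below is an assumption-laden illustration, not the authors' algorithm: the function name fit_smoothed_gmm, the second-difference curvature penalty on the centroids, and the kernel smoothing of the expected counts (a crude stand-in for the paper's smoothing prior on the selection probabilities) are all illustrative choices.

# Hypothetical sketch: MAP-EM for a Gaussian mixture on a 1-D lattice with a
# Gaussian smoothing prior on the centroids (the "statistical SOM" view) and
# variable mixing weights. Not the paper's algorithm; see caveats above.
import numpy as np

def fit_smoothed_gmm(X, K=20, alpha=10.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    N, D = X.shape
    # Initialize centroids from random data points, ordered along dimension 0
    # so that lattice neighbors start out spatially close (assumption).
    M = X[rng.choice(N, K, replace=False)]
    M = M[np.argsort(M[:, 0])]
    pi = np.full(K, 1.0 / K)            # variable component selection probs
    sigma2 = X.var()                    # shared isotropic variance
    # Second-difference operator: penalizes curvature of the centroid curve.
    Dmat = np.diff(np.eye(K), n=2, axis=0)
    P = Dmat.T @ Dmat                   # K x K smoothness penalty matrix
    for _ in range(n_iter):
        # E-step: responsibilities under isotropic Gaussian components.
        d2 = ((X[:, None, :] - M[None, :, :]) ** 2).sum(-1)   # N x K
        logr = np.log(pi) - 0.5 * d2 / sigma2
        logr -= logr.max(axis=1, keepdims=True)
        r = np.exp(logr)
        r /= r.sum(axis=1, keepdims=True)
        Nk = r.sum(axis=0)              # expected counts per component
        # M-step, centroids: the smoothing prior turns the usual weighted
        # mean into a ridge-like linear system.
        A = np.diag(Nk) / sigma2 + alpha * P
        M = np.linalg.solve(A, r.T @ X / sigma2)
        # M-step, mixing weights: smooth the expected counts along the
        # lattice before normalizing (stand-in for the paper's prior on
        # the component selection probabilities).
        Nk_s = np.convolve(Nk, [0.25, 0.5, 0.25], mode="same")
        pi = Nk_s / Nk_s.sum()
        # M-step, shared variance.
        sigma2 = (r * d2).sum() / (N * D)
    return M, pi, sigma2

Setting pi permanently to the fixed value 1/K in this loop recovers the standard statistical-SOM behavior the abstract criticizes: the centroids crowd into high-density regions, deforming the coordinate system on the extracted manifold.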