A Classification EM algorithm for clustering and two stochastic versions. Computational Statistics & Data Analysis (special issue on optimization techniques in statistics).
A Bayesian analysis of self-organizing maps. Neural Computation.
GTM: the generative topographic mapping. Neural Computation.
Deterministic annealing EM algorithm. Neural Networks.
Generative probability density model in the self-organizing map. Self-Organizing Neural Networks.
Self-Organizing Maps.
Joint entropy maximization in kernel-based topographic maps. Neural Computation.
A unified framework for model-based clustering. Journal of Machine Learning Research.
Maximum Likelihood Topographic Map Formation. Neural Computation.
SMEM Algorithm for Mixture Models. Neural Computation.
Convergence and Ordering of Kohonen's Batch Map. Neural Computation.
EURASIP Journal on Applied Signal Processing.
Bayesian Regularization for Normal Mixture Estimation and Model-Based Clustering. Journal of Classification.
Self-organizing mixture models. Neurocomputing.
Yet another algorithm which can generate topography map. IEEE Transactions on Neural Networks.
Self-organizing maps, vector quantization, and mixture modeling. IEEE Transactions on Neural Networks.
PRSOM: a new visualization method by hybridizing multidimensional scaling and self-organizing map. IEEE Transactions on Neural Networks.
Probabilistic self-organizing maps for qualitative data. Neural Networks.
Probabilistic self-organizing maps for continuous data. IEEE Transactions on Neural Networks.
UNN: a neural network for uncertain data classification. PAKDD'10: Proceedings of the 14th Pacific-Asia Conference on Advances in Knowledge Discovery and Data Mining, Part I.
In this paper, we consider the learning process of a probabilistic self-organizing map (PbSOM) as a model-based data clustering procedure that preserves the topological relationships between data clusters in a neural network. Based on this concept, we develop a coupling-likelihood mixture model for the PbSOM that extends the reference vectors in Kohonen's self-organizing map (SOM) to multivariate Gaussian distributions. We also derive three expectation-maximization (EM)-type algorithms, called the SOCEM, SOEM, and SODAEM algorithms, for learning the model (PbSOM) based on the maximum-likelihood criterion. SOCEM is derived by using the classification EM (CEM) algorithm to maximize the classification likelihood; SOEM is derived by using the EM algorithm to maximize the mixture likelihood; and SODAEM is a deterministic annealing (DA) variant of SOCEM and SOEM. Moreover, by shrinking the neighborhood size, SOCEM and SOEM can be interpreted, respectively, as DA variants of the CEM and EM algorithms for Gaussian model-based clustering. The experimental results show that the proposed PbSOM learning algorithms achieve comparable data clustering performance to that of the deterministic annealing EM (DAEM) approach, while maintaining the topology-preserving property.
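The abstract's idea of coupling a Gaussian mixture to a lattice can be sketched as a single EM-style update. This is a minimal illustrative sketch, not the paper's exact coupling-likelihood model: it assumes a square lattice, isotropic Gaussians with fixed variance and equal priors, and Heskes-style neighborhood smoothing of the responsibilities; the function names (`neighborhood`, `soem_step`) and the shared-variance simplification are my own.

```python
import numpy as np

def neighborhood(grid, sigma):
    """Lattice coupling kernel: h[k, l] = exp(-||r_k - r_l||^2 / (2 sigma^2))."""
    d2 = ((grid[:, None, :] - grid[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def soem_step(X, mu, grid, sigma, var=1.0):
    """One SOEM-like iteration (illustrative sketch, not the paper's exact update).

    E-step: soft responsibilities under isotropic Gaussians with equal priors.
    Coupling: responsibilities are smoothed over the lattice neighborhood,
    so a unit also learns from data assigned to its lattice neighbors.
    M-step: neighborhood-weighted mean update, as in the batch SOM.
    """
    # E-step: log-responsibilities, stabilized before exponentiation
    d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)   # (n, K)
    logp = -d2 / (2 * var)
    logp -= logp.max(axis=1, keepdims=True)
    r = np.exp(logp)
    r /= r.sum(axis=1, keepdims=True)
    # Coupling: spread each point's responsibility to lattice neighbors
    w = r @ neighborhood(grid, sigma)                      # (n, K)
    # M-step: each mean is a convex combination of the data points
    return (w.T @ X) / w.sum(axis=0)[:, None]
```

Shrinking `sigma` toward zero across iterations decouples the units, at which point the update reduces to an ordinary EM mean update for a Gaussian mixture, which mirrors the abstract's observation that SOEM becomes a DA variant of EM as the neighborhood shrinks.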