On the stationary state of Kohonen's self-organizing sensory mapping
Biological Cybernetics
A stochastic model of retinotopy: A self organizing process
Biological Cybernetics
Self-organization and associative memory: 3rd edition
Interpreting the Kohonen self-organizing feature map using contiguity-constrained clustering
Pattern Recognition Letters
Extending the Kohonen self-organizing map networks for clustering analysis
Computational Statistics & Data Analysis
Cybernetic Systems
Information Systems Research
The Effect of Sample Size on the Extended Self-Organizing Map Network for Market Segmentation
HICSS '05: Proceedings of the 38th Annual Hawaii International Conference on System Sciences - Track 3 - Volume 03
An extended self-organizing map network for market segmentation: a telecommunication example
Decision Support Systems
Expert Systems with Applications: An International Journal
A data mining framework for detecting subscription fraud in telecommunication
Engineering Applications of Artificial Intelligence
Kohonen's self-organizing map (SOM) network maps input data onto a lower-dimensional output map. The extended SOM network further groups the nodes on the output map into a user-specified number of clusters. Kiang, Hu, and Fisher applied the extended SOM network to market segmentation and showed that it outperforms the statistical approach that first reduces the dimensionality of the problem via factor analysis and then forms segments with cluster analysis. In this study, we examined the effect of sample size on the extended SOM relative to its effect on the factor/cluster approach. Two sampling schemes were used: random sampling and proportionate sampling. The two approaches were compared by their correct classification rates at various sample sizes. Unlike statistical models, neural networks do not depend on statistical assumptions; consequently, their results are stable across sample sizes, although they remain sensitive to initial weights and model specification.
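The two-stage idea behind the extended SOM can be sketched as follows: first train a standard Kohonen SOM on the input records, then group the resulting map nodes (codebook vectors) into a user-specified number of clusters, and finally assign each record the cluster of its best-matching node. The sketch below is a minimal, assumption-laden illustration, not the authors' exact method: the grid size, learning-rate and neighborhood schedules, and the use of plain k-means for the node-grouping step are all illustrative choices.

```python
import numpy as np

def train_som(data, grid_w=5, grid_h=5, epochs=20, lr0=0.5, sigma0=2.0, seed=0):
    """Train a minimal 2-D Kohonen SOM; returns a (grid_h*grid_w, dim) codebook."""
    rng = np.random.default_rng(seed)
    n, dim = data.shape
    codebook = rng.standard_normal((grid_h * grid_w, dim)) * 0.1 + data.mean(0)
    # Grid coordinates of each node, used for the neighborhood function.
    coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)], float)
    t, t_max = 0, epochs * n
    for _ in range(epochs):
        for x in data[rng.permutation(n)]:
            lr = lr0 * (1 - t / t_max)               # decaying learning rate
            sigma = sigma0 * (1 - t / t_max) + 0.5   # shrinking neighborhood width
            bmu = np.argmin(((codebook - x) ** 2).sum(1))  # best-matching unit
            d2 = ((coords - coords[bmu]) ** 2).sum(1)
            h = np.exp(-d2 / (2 * sigma ** 2))       # Gaussian neighborhood
            codebook += lr * h[:, None] * (x - codebook)
            t += 1
    return codebook

def kmeans(points, k, iters=50, seed=0):
    """Plain k-means, used here to group the SOM nodes into k clusters."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers) ** 2).sum(2), 1)
        for c in range(k):
            if np.any(labels == c):
                centers[c] = points[labels == c].mean(0)
    return labels

# Two well-separated Gaussian blobs as toy "market" data (hypothetical inputs).
rng = np.random.default_rng(1)
data = np.vstack([rng.normal(0, 0.3, (100, 4)), rng.normal(3, 0.3, (100, 4))])

codebook = train_som(data)
node_cluster = kmeans(codebook, k=2)                 # stage 2: cluster the map nodes
bmus = np.argmin(((data[:, None] - codebook) ** 2).sum(2), 1)
segments = node_cluster[bmus]                        # segment label per input record
print("segment labels per record:", segments.shape)
```

Because the second stage clusters only the handful of map nodes rather than the raw records, the grouping step is cheap regardless of sample size, which is one reason the approach is attractive for segmentation studies of varying sample sizes.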