Self-Organising Maps (SOM) are Artificial Neural Networks used in Pattern Recognition tasks. Their major advantage over other architectures is the human readability of the model. However, they often achieve lower accuracy. The most commonly used metric in SOMs is the Euclidean distance, which is not the best choice for some problems. In this paper, we study the impact of a metric change on the SOM's performance in classification problems. In order to change the metric of the SOM we applied a distance metric learning method, the so-called 'Large Margin Nearest Neighbour' (LMNN). It computes a Mahalanobis matrix, which assures a small distance between nearest-neighbour points from the same class and separates points belonging to different classes by a large margin. Results are presented on several real data sets, covering, for example, recognition of written digits, spoken letters and faces.
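The metric change described above amounts to replacing the SOM's Euclidean distance with a Mahalanobis distance parameterised by the learned matrix. A minimal sketch of that distance is below; the matrix `M` here is a hypothetical stand-in for the one LMNN would learn (LMNN optimises a positive semi-definite `M = L^T L`), not the actual learned matrix from the paper.

```python
import numpy as np

def mahalanobis(x, y, M):
    """Mahalanobis distance d(x, y) = sqrt((x - y)^T M (x - y)).

    M is a positive semi-definite matrix such as the one LMNN learns;
    M = I recovers the Euclidean distance used by a standard SOM.
    """
    d = x - y
    return float(np.sqrt(d @ M @ d))

# Hypothetical learned linear transform L; LMNN guarantees M = L^T L is PSD.
L = np.array([[2.0, 0.0],
              [0.0, 1.0]])
M = L.T @ L

x = np.array([1.0, 0.0])
y = np.array([0.0, 0.0])

print(mahalanobis(x, y, np.eye(2)))  # Euclidean baseline: 1.0
print(mahalanobis(x, y, M))          # stretched along the first axis: 2.0
```

In a SOM, this function would replace the Euclidean distance both when finding the best-matching unit and, consistently, in the space where the codebook vectors live.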