A theory of self-organising neural networks
MANNA '95 Proceedings of the first international conference on Mathematics of neural networks: models, algorithms and applications
In this paper the theory of unsupervised multi-layer stochastic vector quantiser (SVQ) networks is reviewed, and then extended to the supervised case, where the network is used as a classifier. This leads to a hybrid approach, in which training is governed by both unsupervised and supervised pieces of the network objective function. The unsupervised piece aims to preserve enough information for the network to reconstruct its input accurately (i.e. the network serves as an encoder), whereas the supervised piece aims to reproduce the classification output supplied by an external teacher (i.e. the network serves as a classifier). The tension between these two pieces of the objective function leads to an optimal network in which, typically, the lower layers (near the input) act as faithful encoders of the input, whereas the higher layers (near the output) act as faithful classifiers. The results of some simulations are presented to illustrate these properties.
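The hybrid objective described in the abstract can be sketched as a weighted sum of an unsupervised reconstruction term and a supervised classification term. The sketch below is illustrative only: the weighting parameter `lam`, the squared-error reconstruction cost, and the cross-entropy classification cost are assumptions for the sake of a runnable example, not the paper's actual objective function.

```python
import math

def hybrid_objective(x, y_true, encode, decode, classify, lam=0.5):
    """Illustrative hybrid objective for an encoder/classifier network.

    lam weights the unsupervised (reconstruction) piece and (1 - lam)
    weights the supervised (classification) piece; lam is a hypothetical
    trade-off parameter introduced here for illustration.
    """
    code = encode(x)

    # Unsupervised piece: squared error between the input and its
    # reconstruction from the internal code (network as encoder).
    recon = decode(code)
    d_recon = sum((a - b) ** 2 for a, b in zip(x, recon))

    # Supervised piece: cross-entropy against the teacher's class label
    # (network as classifier). classify() returns class probabilities.
    probs = classify(code)
    d_class = -math.log(max(probs[y_true], 1e-12))

    return lam * d_recon + (1.0 - lam) * d_class


# Toy usage: identity encoder/decoder (perfect reconstruction) and a
# fixed classifier, so only the supervised piece contributes.
x = [0.2, 0.8]
ident = lambda v: list(v)
clf = lambda c: [0.3, 0.7]   # hypothetical class probabilities
loss = hybrid_objective(x, 1, ident, ident, clf, lam=0.5)
```

Setting `lam` near 1 drives the network toward a faithful encoder, while `lam` near 0 drives it toward a faithful classifier, mirroring the tension between the two pieces described above.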