The process of representing a large data set with a smaller number of vectors in the best possible way, known as vector quantization, has been studied intensively in recent years, and very efficient algorithms such as the Kohonen self-organizing map (SOM) and the Linde-Buzo-Gray (LBG) algorithm have been devised. In this paper a physical approach to the problem is taken: it is shown that by considering the processing elements as points moving in a potential field, an algorithm as efficient as those mentioned above can be derived. Unlike SOM and LBG, this algorithm has a clear physical interpretation and relies on the minimization of a well-defined cost function. It is also shown how the potential-field approach can be linked to information theory through the Parzen density estimator. In this light it becomes clear that minimizing the free energy of the system is equivalent to minimizing a divergence measure between the distribution of the data and the distribution of the processing elements; the algorithm can therefore be seen as a density-matching method.
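
To make the density-matching idea concrete, the following NumPy sketch illustrates the kind of update the abstract describes: each processing element is attracted toward the Parzen-smoothed data density and repelled by the other processing elements, which amounts to gradient descent on a Cauchy-Schwarz-style divergence between the two Parzen estimates. This is a minimal illustration, not the paper's exact algorithm; the kernel width sigma, the learning rate eta, the epoch count, and all function names are illustrative assumptions.

import numpy as np

def gaussian(d2, sigma):
    # Unnormalised Gaussian kernel of squared distances; the common
    # normalisation constant cancels in the log-gradients below.
    return np.exp(-d2 / (4.0 * sigma ** 2))

def sq_dists(A, B):
    # Pairwise squared Euclidean distances between rows of A and rows of B.
    return ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)

def vq_information_theoretic(X, M=16, sigma=0.5, eta=0.05, epochs=200, seed=0):
    """Fit M code vectors to data X (N x d) by gradient descent on a
    Cauchy-Schwarz-style divergence between Parzen estimates of the
    data density and the processing-element density (a sketch)."""
    rng = np.random.default_rng(seed)
    W = X[rng.choice(len(X), M, replace=False)].copy()  # init on data points
    for _ in range(epochs):
        # Attraction term: cross information potential between data and codes.
        Kc = gaussian(sq_dists(X, W), sigma)            # N x M
        Vc = Kc.mean()
        # Repulsion term: information potential among the code vectors.
        Kw = gaussian(sq_dists(W, W), sigma)            # M x M
        Vw = Kw.mean()
        # Gradients of Vc and Vw with respect to each code vector w_j.
        diff_c = X[:, None, :] - W[None, :, :]          # N x M x d
        grad_c = (Kc[..., None] * diff_c).sum(0) / (Kc.size * 2 * sigma ** 2)
        diff_w = W[:, None, :] - W[None, :, :]          # M x M x d
        grad_w = -2.0 * (Kw[..., None] * diff_w).sum(1) / (Kw.size * 2 * sigma ** 2)
        # Descend J(W) = -2 log Vc + log Vw: pull toward data, push codes apart.
        W += eta * (2.0 * grad_c / Vc - grad_w / Vw)
    return W

# Example usage:
#   X = np.random.default_rng(1).normal(size=(500, 2))
#   W = vq_information_theoretic(X, M=16)

In practice, such potential-field schemes are often run with the kernel width sigma annealed from a large value to a small one, so that the processing elements first spread out globally and then refine their positions locally.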