Vector quantization using information theoretic concepts

  • Authors:
  • Tue Lehn-Schiøler; Anant Hegde; Deniz Erdogmus; Jose C. Principe

  • Affiliations:
  • Intelligent Signal Processing, Informatics and Mathematical Modelling, Technical University of Denmark, Lyngby, Denmark (T. Lehn-Schiøler); Computational NeuroEngineering Laboratory, Electrical and Computer Engineering Department, University of Florida, Gainesville, FL 32611, USA (A. Hegde, D. Erdogmus, J. C. Principe)

  • Venue:
  • Natural Computing: An International Journal
  • Year:
  • 2005

Abstract

The process of representing a large data set with a smaller number of vectors in the best possible way, also known as vector quantization, has been studied intensively in recent years. Very efficient algorithms, such as the Kohonen self-organizing map (SOM) and the Linde-Buzo-Gray (LBG) algorithm, have been devised. In this paper, a physical approach to the problem is taken, and it is shown that, by considering the processing elements as points moving in a potential field, an algorithm as efficient as those mentioned above can be derived. Unlike SOM and LBG, this algorithm has a clear physical interpretation and relies on the minimization of a well-defined cost function. It is also shown how the potential field approach can be linked to information theory through the Parzen density estimator. In the light of information theory, it becomes clear that minimizing the free energy of the system is in fact equivalent to minimizing a divergence measure between the distribution of the data and the distribution of the processing elements; hence, the algorithm can be seen as a density-matching method.
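
To make the density-matching idea concrete, the sketch below shows one plausible instantiation, not the paper's exact algorithm: gradient descent on the Cauchy-Schwarz divergence between Gaussian Parzen estimates of the data and of the codebook, a common formulation in the information theoretic learning literature. The cross term acts as an attractive potential pulling processing elements toward the data, while the self term acts as a repulsive potential keeping them apart. The function name `itvq`, all parameter values, and the fixed kernel width are illustrative assumptions; the paper's cost function and any annealing schedule may differ.

```python
import numpy as np

def _sqdist(a, b):
    # Pairwise squared Euclidean distances between rows of a and rows of b.
    return ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)

def itvq(x, n_codes=16, sigma=0.5, lr=0.05, n_iter=500, seed=0):
    """Hypothetical sketch: gradient descent on the Cauchy-Schwarz divergence
    between Gaussian Parzen estimates of the data x (n_samples x dim) and
    the codebook w (n_codes x dim)."""
    rng = np.random.default_rng(seed)
    w = x[rng.choice(len(x), size=n_codes, replace=False)].copy()
    s2 = 2.0 * sigma ** 2  # variance of the convolution of two Parzen kernels
    for _ in range(n_iter):
        k_xw = np.exp(-_sqdist(x, w) / (2.0 * s2))  # data-code interactions
        k_ww = np.exp(-_sqdist(w, w) / (2.0 * s2))  # code-code interactions
        cross = k_xw.sum()  # ~ integral of p*q (attractive potential)
        self_ = k_ww.sum()  # ~ integral of q^2 (repulsive potential)
        # Gradient of log(cross): pulls each code toward nearby data points.
        g_cross = (k_xw[:, :, None] * (x[:, None, :] - w[None, :, :])).sum(axis=0)
        g_cross /= s2 * cross
        # Gradient of log(self): pushes codes apart so they do not collapse.
        g_self = 2.0 * (k_ww[:, :, None] * (w[None, :, :] - w[:, None, :])).sum(axis=1)
        g_self /= s2 * self_
        # D_CS = log(self) - 2*log(cross) + const; take a descent step on it.
        w += lr * (2.0 * g_cross - g_self)
    return w

if __name__ == "__main__":
    data = np.random.default_rng(1).normal(size=(500, 2))
    print(itvq(data, n_codes=8))
```

A practical refinement discussed in this line of work is annealing: starting with a large kernel width so codes see the global structure, then shrinking sigma over the iterations to refine local placement.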