Learning from Examples with Information Theoretic Criteria

  • Authors:
  • Jose C. Principe; Dongxin Xu; Qun Zhao; John W. Fisher, III

  • Affiliations:
  • Computational NeuroEngineering Laboratory, University of Florida, Gainesville, FL 32611, USA (all authors)

  • Venue:
  • Journal of VLSI Signal Processing Systems
  • Year:
  • 2000


Abstract

This paper discusses a framework for learning based on information theoretic criteria. A novel algorithm based on Renyi's quadratic entropy is used to train, directly from a data set, linear or nonlinear mappers for entropy maximization or minimization. We provide an intriguing analogy between this computation and an information potential that measures the interactions among the data samples. We also propose two approximations to the Kullback-Leibler divergence based on quadratic distances (the Cauchy-Schwarz inequality and the Euclidean distance). These distances can still be computed using the information potential. We test the newly proposed distances in blind source separation (unsupervised learning) and in feature extraction for classification (supervised learning). In blind source separation our algorithm is capable of separating instantaneously mixed sources, and in classification the performance of our classifier is comparable to that of support vector machines (SVMs).
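
To make the quantities named in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of Parzen-window estimators for the information potential, Renyi's quadratic entropy, and a Cauchy-Schwarz quadratic divergence between two sample sets. It assumes isotropic Gaussian kernels with a user-chosen bandwidth sigma; the function names, the NumPy formulation, and the exact form of the divergence follow common presentations of these estimators and are illustrative assumptions rather than the paper's exact algorithm.

```python
import numpy as np

def gaussian_kernel(diff, sigma):
    """Isotropic multivariate Gaussian kernel evaluated at pairwise differences."""
    d = diff.shape[-1]
    norm = (2.0 * np.pi * sigma**2) ** (-d / 2.0)
    return norm * np.exp(-np.sum(diff**2, axis=-1) / (2.0 * sigma**2))

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the integral of p(x)^2 from samples x of shape (N, d).

    The double sum of pairwise kernel interactions is the 'information potential':
    each pair of samples contributes as if they were interacting particles.
    Two Parzen kernels of width sigma convolve to width sqrt(2)*sigma,
    hence the widened kernel below.
    """
    n = x.shape[0]
    diff = x[:, None, :] - x[None, :, :]              # (N, N, d) pairwise differences
    k = gaussian_kernel(diff, np.sqrt(2.0) * sigma)   # (N, N) pairwise interactions
    return k.sum() / (n * n)

def renyi_quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy estimate: H_2(X) = -log(information potential)."""
    return -np.log(information_potential(x, sigma))

def cauchy_schwarz_divergence(x, y, sigma=1.0):
    """One common form of the Cauchy-Schwarz quadratic divergence between sample sets.

    D_CS(p, q) = -log[ (int p q)^2 / (int p^2 * int q^2) ] >= 0, with equality iff p = q.
    All three integrals are estimated with the same pairwise-kernel machinery
    used for the information potential (an assumption about the exact constants).
    """
    nx, ny = x.shape[0], y.shape[0]
    cross = gaussian_kernel(x[:, None, :] - y[None, :, :],
                            np.sqrt(2.0) * sigma).sum() / (nx * ny)
    vx = information_potential(x, sigma)
    vy = information_potential(y, sigma)
    return -np.log(cross**2 / (vx * vy))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    a = rng.normal(0.0, 1.0, size=(200, 2))
    a2 = rng.normal(0.0, 1.0, size=(200, 2))
    b = rng.normal(2.0, 1.0, size=(200, 2))
    print("H2 estimate of a:   ", renyi_quadratic_entropy(a, sigma=0.5))
    print("D_CS(a, a'), ~small:", cauchy_schwarz_divergence(a, a2, sigma=0.5))
    print("D_CS(a, b), larger: ", cauchy_schwarz_divergence(a, b, sigma=0.5))
```

Because both the entropy and the divergence reduce to sums of pairwise Gaussian evaluations, their gradients with respect to a mapper's outputs are also pairwise sums, which is what makes it possible to train linear or nonlinear mappers directly from data with these criteria.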