Beyond second-order statistics for learning: A pairwise interaction model for entropy estimation

  • Authors:
  • Deniz Erdogmus; Jose C. Principe; Kenneth E. Hild, II

  • Affiliations:
  • Computational NeuroEngineering Laboratory, Electrical & Computer Engineering Department, University of Florida, Gainesville, FL 32611, USA (all authors)

  • Venue:
  • Natural Computing: an international journal
  • Year:
  • 2002


Abstract

Second-order statistics have formed the basis of learning and adaptation due to their appeal and analytical simplicity. However, in many realistic engineering problems requiring adaptive solutions, it is not sufficient to consider only the second-order statistics of the underlying distributions. Entropy, being the average information content of a distribution, is a better-suited criterion for adaptation purposes, since it allows the designer to manipulate the information content of the signals rather than merely their power. This paper introduces a nonparametric estimator of Renyi's entropy, which can be utilized in any adaptation scenario where entropy plays a role. This nonparametric estimator leads to an interesting analogy between learning and interacting particles in a potential field; learning by second-order statistics turns out to be a special case of this interaction model. We investigate the mathematical properties of this nonparametric entropy estimator, provide batch and stochastic gradient expressions for off-line and on-line adaptation, and illustrate the performance of the corresponding algorithms in examples of supervised and unsupervised training, including time-series prediction and ICA.
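
To make the pairwise-interaction idea concrete, here is a minimal sketch of the kind of estimator the abstract describes, specialized to Renyi's quadratic entropy (order alpha = 2) with a Gaussian Parzen window. The double sum over sample pairs is the "information potential," and each kernel evaluation can be read as an interaction between two particles. The function name, the kernel-width parameter `sigma`, and the test data are illustrative assumptions, not the paper's notation or experiments.

```python
import numpy as np

def renyi_quadratic_entropy(samples, sigma=1.0):
    """Sketch of a nonparametric estimate of Renyi's quadratic entropy,
    H2(X) = -log E[p(X)], from 1-D samples via Gaussian Parzen windows.

    The pairwise double sum is the 'information potential'; each pairwise
    kernel evaluation acts like an interaction between two particles.
    `sigma` (the Parzen kernel width) is a free design parameter here.
    """
    x = np.asarray(samples, dtype=float).reshape(-1)
    n = x.size
    # Convolving two Gaussian kernels of width sigma gives an effective
    # pairwise kernel of variance 2 * sigma**2.
    diffs = x[:, None] - x[None, :]
    var = 2.0 * sigma**2
    kernel = np.exp(-diffs**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)
    information_potential = kernel.sum() / n**2
    return -np.log(information_potential)

# Usage: tighter distributions yield a lower entropy estimate.
rng = np.random.default_rng(0)
print(renyi_quadratic_entropy(rng.normal(0.0, 1.0, 500)))  # wider spread
print(renyi_quadratic_entropy(rng.normal(0.0, 0.3, 500)))  # narrower spread
```

For adaptation, one would differentiate the information potential with respect to the system parameters (in batch form over all pairs, or stochastically over recent samples for on-line training), which is the role of the gradient expressions mentioned in the abstract.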