Fast kernel entropy estimation and optimization

  • Authors:
  • Sarit Shwartz; Michael Zibulevsky; Yoav Y. Schechner

  • Affiliations:
  • Department of Electrical Engineering, Technion--Israel Institute of Technology, Haifa, Israel (all authors)

  • Venue:
  • Signal Processing - Special issue: Information theoretic signal processing
  • Year:
  • 2005

Abstract

Differential entropy is a quantity used in many signal processing problems. Often we need to calculate not only the entropy itself but also its gradient with respect to various variables, e.g., for efficient optimization or sensitivity analysis. Entropy estimation can be based on an estimate of the probability density function, which is computationally costly if done naively. Some prior algorithms use computationally efficient non-parametric entropy estimators. However, differentiation of these previously proposed estimators is difficult and may even be undefined. To overcome these obstacles, we consider non-parametric kernel entropy estimation, which is differentiable. We present two accelerated kernel algorithms. The first accelerates the entropy gradient calculation based on a back-propagation principle, allowing the differential entropy gradient to be computed at the same computational complexity as the entropy itself. The second accelerates the estimation of both the entropy and its gradient by using fast convolution over a uniform grid. As an example, we apply both algorithms to blind source separation.
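To make the quantities in the abstract concrete, the following is a minimal NumPy sketch of a Gaussian kernel (Parzen-window) differential entropy estimate together with its analytic gradient with respect to the samples. This is a naive O(N²) evaluation for illustration only; the paper's contribution is accelerating these computations (via back-propagation and fast grid convolution), and the function name, bandwidth handling, and exact estimator form here are assumptions, not the authors' implementation.

```python
import numpy as np

def parzen_entropy_and_grad(x, h):
    """Kernel (Parzen-window) estimate of differential entropy
    H = -(1/N) * sum_i log p_hat(x_i), with a Gaussian kernel of
    bandwidth h, plus the analytic gradient dH/dx_k.
    Naive O(N^2) version for illustration; not the paper's
    accelerated algorithm."""
    x = np.asarray(x, dtype=float)
    n = x.size
    u = (x[:, None] - x[None, :]) / h                  # pairwise scaled differences
    phi = np.exp(-0.5 * u**2) / np.sqrt(2.0 * np.pi)   # Gaussian kernel values
    p = phi.sum(axis=1) / (n * h)                      # density estimate at each sample
    H = -np.mean(np.log(p))
    dphi = -u * phi                                    # phi'(u) for the Gaussian kernel
    # dH/dx_k collects the i = k and j = k contributions of the chain rule
    grad = -(dphi.sum(axis=1) / p
             - (dphi / p[:, None]).sum(axis=0)) / (n**2 * h**2)
    return H, grad
```

The gradient can be sanity-checked against finite differences of the entropy value; the paper's back-propagation scheme produces this same gradient at the cost of one entropy evaluation rather than N of them.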