Estimating the information potential with the fast gauss transform

  • Authors:
  • Seungju Han; Sudhir Rao; Jose Principe

  • Affiliations:
  • CNEL, Department of Electrical and Computer Engineering, University of Florida, Gainesville (all authors)

  • Venue:
  • ICA'06 Proceedings of the 6th international conference on Independent Component Analysis and Blind Signal Separation
  • Year:
  • 2006


Abstract

In this paper, we propose a fast and accurate approximation to the information potential of Information Theoretic Learning (ITL) using the Fast Gauss Transform (FGT). We exemplify the method with the Minimum Error Entropy (MEE) criterion for training adaptive systems. The FGT reduces the complexity of the estimation from O(N^2) to O(pkN), where p is the order of the Hermite approximation and k is the number of clusters used by the FGT. Further, we show that the FGT estimate converges rapidly to the actual entropy value as the order p increases, unlike the Stochastic Information Gradient, the existing O(N) approximation for reducing the computational complexity in ITL. We test the performance of these FGT methods on a system identification problem, with encouraging results.
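The quantity being approximated is Rényi's quadratic information potential, V = (1/N^2) Σ_i Σ_j G_{σ√2}(x_i − x_j), whose direct evaluation costs O(N^2). Below is a minimal one-dimensional sketch (not the authors' implementation) contrasting the direct sum with a truncated Hermite expansion about a single cluster centre, i.e. k = 1, giving O(pN); the paper's FGT partitions the data into k clusters for O(pkN). Function names and the single-centre simplification are illustrative assumptions.

```python
import math
import numpy as np

def info_potential_direct(x, sigma):
    """Direct O(N^2) information potential:
    V = (1/N^2) sum_i sum_j G_{s}(x_i - x_j), with s = sigma*sqrt(2)
    (the bandwidth of the convolution of two Gaussian kernels)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sigma * math.sqrt(2.0)
    d = x[:, None] - x[None, :]
    g = np.exp(-d**2 / (2.0 * s**2)) / (math.sqrt(2.0 * math.pi) * s)
    return g.sum() / n**2

def info_potential_fgt(x, sigma, p=8):
    """O(pN) sketch using a single expansion centre (k = 1).
    The FGT writes exp(-(y - x_j)^2 / delta) as a Hermite series
    about a centre c; the Hermite functions h_m(t) obey
    h_0 = exp(-t^2) and h_{m+1} = 2 t h_m - 2 m h_{m-1}."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    s = sigma * math.sqrt(2.0)
    delta = 2.0 * s**2                 # matches exp(-(xi-xj)^2/delta)
    c = x.mean()                       # single cluster centre (assumption)
    t = (x - c) / math.sqrt(delta)
    # Source moments A_m = (1/m!) sum_j t_j^m  -- one O(pN) pass.
    A = np.array([(t**m).sum() / math.factorial(m) for m in range(p)])
    # Evaluate sum_j exp(-(t_i - t_j)^2) ~ sum_m A_m h_m(t_i)
    # via the Hermite recurrence -- a second O(pN) pass.
    h_prev = np.exp(-t**2)             # h_0(t)
    h_curr = 2.0 * t * h_prev          # h_1(t)
    total = A[0] * h_prev
    if p > 1:
        total = total + A[1] * h_curr
    for m in range(2, p):
        h_next = 2.0 * t * h_curr - 2.0 * (m - 1) * h_prev
        total = total + A[m] * h_next
        h_prev, h_curr = h_curr, h_next
    return total.sum() / (math.sqrt(2.0 * math.pi) * s * n**2)
```

The truncation error decays factorially in p when the sources sit close to the expansion centre, which is the rapid-convergence behaviour the abstract contrasts with the Stochastic Information Gradient.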