Entropy minimization for supervised digital communications channel equalization

  • Authors:
  • I. Santamaria; D. Erdogmus; J.C. Principe

  • Affiliations:
  • DICOM, Cantabria Univ., Santander

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2002

Abstract

This paper investigates the application of error-entropy minimization algorithms to digital communications channel equalization. The pdf of the error between the training sequence and the output of the equalizer is estimated using the Parzen windowing method with a Gaussian kernel, and then Renyi's quadratic entropy is minimized using a gradient descent algorithm. By estimating Renyi's entropy over a short sliding window, an online training algorithm is also introduced. Moreover, for a linear equalizer, an orthogonality condition for the minimum-entropy solution is derived, which leads to an alternative fixed-point iterative minimization method. The performance of linear and nonlinear equalizers trained with entropy and mean square error (MSE) is compared. As expected, the results of training a linear equalizer are very similar for both criteria since, even if the input noise is non-Gaussian, the output filtered noise tends to be Gaussian. On the other hand, for nonlinear channels and using a multilayer perceptron (MLP) as the equalizer, differences between both criteria appear. Specifically, it is shown that the additional information used by the entropy criterion yields a faster convergence in comparison with the MSE criterion.
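To make the training criterion described in the abstract concrete, the sketch below shows one way the batch minimum-error-entropy (MEE) update for a linear equalizer could look: the error pdf is estimated with a Gaussian Parzen window of width sigma, Renyi's quadratic entropy H2 = -log V is computed from the pairwise-kernel "information potential" V, and the weights are moved downhill on H2 by gradient descent. This is a minimal illustration under assumed conventions, not the authors' implementation; the names X (matrix of tap-input vectors), d (training symbols), sigma, and mu are illustrative choices.

```python
import numpy as np

def gaussian_kernel(u, width):
    """Gaussian kernel of standard deviation `width` (Parzen window)."""
    return np.exp(-u**2 / (2.0 * width**2)) / (np.sqrt(2.0 * np.pi) * width)

def quadratic_entropy(e, sigma):
    """Renyi's quadratic entropy estimate H2 = -log V of the error samples e.
    V is the mean of pairwise Gaussian kernels with width sigma*sqrt(2),
    which results from convolving two Parzen kernels of width sigma."""
    diffs = e[:, None] - e[None, :]
    V = gaussian_kernel(diffs, sigma * np.sqrt(2.0)).mean()
    return -np.log(V), V

def mee_gradient_step(w, X, d, sigma, mu):
    """One steepest-descent step on H2 for a linear equalizer y = X @ w.
    X: (N, L) tap-input vectors over a training block, d: (N,) desired symbols.
    All shapes and the step size mu are assumptions for this sketch."""
    e = d - X @ w                                   # error samples over the block
    diffs = e[:, None] - e[None, :]                 # e_i - e_j
    k = gaussian_kernel(diffs, sigma * np.sqrt(2.0))
    V = k.mean()
    # dV/dw: derivative of the pairwise kernel sum w.r.t. the weights,
    # using d(e_i - e_j)/dw = -(x_i - x_j) and G'(u) = -u/(2 sigma^2) G(u)
    coef = k * diffs / (2.0 * sigma**2)
    grad_V = (coef[:, :, None] * (X[:, None, :] - X[None, :, :])).mean(axis=(0, 1))
    grad_H = -grad_V / V                            # dH2/dw = -(1/V) dV/dw
    w_new = w - mu * grad_H                         # descend on the entropy
    return w_new, -np.log(V)

# Hypothetical usage: iterate over training blocks (or a short sliding window,
# as in the online variant mentioned in the abstract) until H2 stops decreasing.
```

A sliding-window online version would apply the same step to the most recent handful of error samples instead of the whole block, trading some gradient accuracy for sample-by-sample adaptation, as the abstract indicates.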