Convergence properties and data efficiency of the minimum error entropy criterion in ADALINE training

  • Authors:
  • D. Erdogmus; J.C. Principe

  • Affiliations:
  • Electr. & Comput. Eng. Dept. NEB, Univ. of Florida, Gainesville, FL, USA

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2003

Abstract

Recently, we have proposed the minimum error entropy (MEE) criterion as an information-theoretic alternative to the widely used mean square error criterion in supervised adaptive system training. For this purpose, we have formulated a nonparametric estimator for Renyi's entropy that employs Parzen windowing. Mathematical investigation of the proposed entropy estimator revealed interesting insights about the process of information-theoretic learning. This new estimator and the associated criteria have been successfully applied to the supervised and unsupervised training of adaptive systems in a wide range of problems. In this paper, we analyze the structure of the MEE performance surface around the optimal solution, and we derive an upper bound on the step size for adaptive linear neuron (ADALINE) training with the steepest descent algorithm under MEE. In addition, we investigate how the entropy order and the kernel size in Parzen windowing affect the shape of the performance surface and the eigenvalues of the Hessian at and around the optimal solution. Conclusions from the theoretical analyses are illustrated through numerical examples.
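The training scheme the abstract describes can be sketched numerically. The following is a minimal NumPy illustration (not the authors' code) of MEE steepest-descent training of an ADALINE: the error entropy is Renyi's quadratic entropy (order α = 2) estimated by Parzen windowing with a Gaussian kernel, and minimizing it is equivalent to maximizing the "information potential" V(e) = (1/N²) Σᵢⱼ G(eᵢ − eⱼ), whose gradient with respect to the weights is computed in closed form below. The kernel size `sigma`, step size `eta`, iteration count, and the noiseless two-tap system-identification setup are all illustrative assumptions.

```python
import numpy as np

# --- Illustrative setup (assumed, not from the paper): identify a 2-tap
# --- linear system d = w_true . x from noiseless input/output pairs.
rng = np.random.default_rng(0)
N = 200
X = rng.normal(size=(N, 2))          # input samples
w_true = np.array([1.5, -0.8])       # unknown ADALINE weights
d = X @ w_true                       # desired responses

sigma = 1.0                          # Parzen kernel size (illustrative)
h2 = 2.0 * sigma**2                  # variance of the pairwise-difference kernel
                                     # (convolution of two sigma-kernels)
eta = 0.5                            # step size (must stay below the paper's bound)

w = np.zeros(2)
for _ in range(500):
    e = d - X @ w                                    # error samples
    de = e[:, None] - e[None, :]                     # pairwise error differences
    k = np.exp(-de**2 / (2.0 * h2))                  # Gaussian kernel values
    dx = X[:, None, :] - X[None, :, :]               # pairwise input differences
    # Gradient of the information potential V(e) w.r.t. the weights:
    # dV/dw = (1 / (h^2 N^2)) * sum_ij (e_i - e_j) * G(e_i - e_j) * (x_i - x_j)
    grad_V = (de[:, :, None] * k[:, :, None] * dx).sum(axis=(0, 1)) / (h2 * N**2)
    w += eta * grad_V                # ascend V  <=>  descend the error entropy

print("estimated weights:", w)       # should approach w_true
```

Note that MEE is insensitive to a constant shift of the error (entropy is shift-invariant), so a bias term would need to be fixed separately, e.g. by matching the error mean after training; the sketch avoids this by omitting the bias.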