Mixture density modeling, Kullback-Leibler divergence, and differential log-likelihood

  • Authors: Marc M. Van Hulle
  • Affiliations: K. U. Leuven, Laboratorium voor Neuro- en Psychofysiologie, Herestraat, Leuven, Belgium
  • Venue: Signal Processing - Special issue: Information theoretic signal processing
  • Year: 2005

Abstract

A new log-likelihood (LL)-based metric, called differential LL, is introduced for goodness-of-fit testing and for monitoring the unsupervised learning of mixture densities. We develop the metric for the case of a Gaussian kernel fitted to a Gaussian distribution. We suggest a possible differential LL learning strategy, show the formal link with the Kullback-Leibler divergence and the quantization error, and introduce a Gaussian factorial distribution approximation by subspaces.
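
The abstract's stated link between log-likelihood and the Kullback-Leibler divergence can be illustrated numerically. The sketch below (not from the paper; the distributions p, q and parameters mu, sigma are illustrative assumptions) uses the standard identity E_p[log q(x)] = -H(p) - KL(p || q), so that maximizing the average LL of a Gaussian kernel q fitted to samples from a Gaussian p is equivalent to minimizing KL(p || q):

```python
# Minimal sketch of the LL / KL-divergence link for a 1-D Gaussian kernel
# fitted to Gaussian data. All symbols here are illustrative, not the
# paper's notation.
import numpy as np

rng = np.random.default_rng(0)

# True distribution p = N(0, 1); Gaussian kernel q = N(mu, sigma^2).
mu, sigma = 0.5, 1.2
x = rng.standard_normal(100_000)

# Empirical average log-likelihood of q on samples drawn from p.
avg_ll = np.mean(-0.5 * np.log(2 * np.pi * sigma**2)
                 - (x - mu) ** 2 / (2 * sigma**2))

# Closed forms: differential entropy of p and KL(p || q) for 1-D Gaussians.
h_p = 0.5 * np.log(2 * np.pi * np.e)                     # H(N(0, 1))
kl = np.log(sigma) + (1 + mu**2) / (2 * sigma**2) - 0.5  # KL(N(0,1) || q)

# Identity: E_p[log q] = -H(p) - KL(p || q); both estimates should agree.
print(f"empirical avg LL : {avg_ll:.4f}")
print(f"-H(p) - KL(p||q) : {-h_p - kl:.4f}")
```

Because H(p) does not depend on the kernel parameters, the gap between the achieved average LL and its maximum -H(p) is exactly KL(p || q), which is the sense in which an LL-based metric can serve as a goodness-of-fit measure.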