The Laplacian Classifier

  • Authors:
  • R. Jenssen; D. Erdogmus; J. C. Principe; T. Eltoft

  • Affiliations:
  • Univ. of Tromso, Tromso

  • Venue:
  • IEEE Transactions on Signal Processing
  • Year:
  • 2007


Abstract

We develop a novel classifier in a kernel feature space related to the eigenspectrum of the Laplacian data matrix. The classification cost function measures the angle between class mean vectors in the kernel feature space and is derived from an information theoretic divergence measure using Parzen windowing. The classification rule is expressed as a weighted kernel expansion. The weight associated with a data point is inversely proportional to the probability density at that point, emphasizing the least probable regions. No optimization is needed to determine the weighting scheme, in contrast to the support vector machine. The connection to Parzen windowing also provides a theoretical criterion for kernel size selection, reducing the need for computationally demanding cross-validation. We show that the new classifier performs better than the Parzen window Bayes classifier and, in many cases, comparably to the support vector machine, at a lower computational cost.
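
The classification rule described above can be illustrated with a short sketch. The code below is a minimal, hypothetical reading of the abstract, not the paper's exact algorithm: it assumes Gaussian Parzen windows, sets each training point's weight to the inverse of its estimated density, and scores a test point by its projection onto each weighted class mean in kernel feature space, normalized by the mean's norm as a stand-in for the angle-based cost. Function names such as `fit_weights` and `classify` are illustrative.

```python
import numpy as np

def gaussian_kernel(a, b, sigma):
    """Gaussian (Parzen) kernel matrix between rows of a and rows of b."""
    d2 = (np.sum(a**2, axis=1)[:, None]
          - 2.0 * a @ b.T
          + np.sum(b**2, axis=1)[None, :])
    return np.exp(-d2 / (2.0 * sigma**2))

def fit_weights(X, sigma):
    """Parzen density estimate at each training point; the weight is its inverse,
    so low-density regions are emphasized, as the abstract describes."""
    K = gaussian_kernel(X, X, sigma)
    density = K.mean(axis=1)          # proportional to the Parzen estimate
    return 1.0 / density

def classify(x_test, X, y, weights, sigma):
    """Weighted kernel expansion: score each class by the normalized projection
    of the test point onto that class's weighted mean in kernel feature space."""
    classes = np.unique(y)
    K = gaussian_kernel(x_test, X, sigma)            # (n_test, n_train)
    scores = []
    for c in classes:
        mask = (y == c)
        w = weights[mask]
        Kc = gaussian_kernel(X[mask], X[mask], sigma)
        norm = np.sqrt(w @ Kc @ w)                   # norm of the weighted class mean
        scores.append((K[:, mask] @ w) / norm)
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

if __name__ == "__main__":
    # Toy two-class example; sigma is fixed here, whereas the paper ties it
    # to a Parzen-based kernel size criterion.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.repeat([0, 1], 50)
    w = fit_weights(X, sigma=1.0)
    print(classify(np.array([[0.0, 0.0], [3.0, 3.0]]), X, y, w, sigma=1.0))
```

Note that, consistent with the abstract, the weights come directly from the density estimate, so no optimization step is required to obtain them.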