A new information theoretic analysis of sum-of-squared-error kernel clustering

  • Authors:
  • Robert Jenssen; Torbjørn Eltoft

  • Affiliations:
  • Department of Physics and Technology, University of Tromsø, N-9037 Tromsø, Norway (both authors)

  • Venue:
  • Neurocomputing
  • Year:
  • 2008

Abstract

The contribution of this paper is to provide a new input space analysis of the properties of sum-of-squared-error K-means clustering performed in a Mercer kernel feature space. Such an analysis has been missing until now, even though kernel K-means has been popular in the clustering literature. Our derivation extends the theory of traditional K-means from properties of mean vectors to information theoretic properties of Parzen window estimated probability density functions (pdfs). In particular, Euclidean distance-based kernel K-means is shown to maximize an integrated squared error divergence measure between cluster pdfs and the overall pdf of the data, while a cosine similarity-based approach maximizes a Cauchy-Schwarz divergence measure. Furthermore, the iterative rules which assign data points to clusters in order to maximize these criteria are shown to depend on the cluster pdfs evaluated at the data points, in addition to the Rényi entropies of the clusters. Bayes' rule is shown to be a special case.
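To make the connection described in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of Euclidean distance-based kernel K-means operating purely on a Gaussian kernel matrix, together with a Cauchy-Schwarz divergence estimate between two cluster pdfs. It relies on the standard fact that with a Gaussian kernel, average kernel sums coincide with Parzen-window estimates of the quadratic "information potentials" entering the CS divergence; all function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    # Pairwise Gaussian (Parzen window) kernel evaluations K[i, j] = k(x_i, x_j).
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def kernel_kmeans(K, k, n_iter=50, seed=0):
    # Lloyd-style kernel K-means using only the kernel matrix: the squared
    # feature-space distance to a cluster mean expands into kernel sums,
    #   ||phi(x) - m_c||^2 = k(x, x) - (2/n_c) * sum_j K[x, j] + (1/n_c^2) * sum_{i,j} K[i, j],
    # with sums over members of cluster c.
    rng = np.random.default_rng(seed)
    n = K.shape[0]
    labels = rng.integers(0, k, n)
    for _ in range(n_iter):
        dist = np.full((n, k), np.inf)
        for c in range(k):
            mask = labels == c
            nc = mask.sum()
            if nc == 0:
                continue  # empty clusters stay at infinite distance
            dist[:, c] = (np.diag(K)
                          - 2.0 * K[:, mask].sum(axis=1) / nc
                          + K[np.ix_(mask, mask)].sum() / nc**2)
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break
        labels = new_labels
    return labels

def cs_divergence(K, labels):
    # Cauchy-Schwarz divergence between the Parzen-estimated pdfs of clusters
    # 0 and 1: D_CS = -log( V_01 / sqrt(V_00 * V_11) ), where each V is a mean
    # of kernel evaluations (an information potential). Nonnegative; zero iff
    # the two estimated pdfs coincide.
    m0, m1 = labels == 0, labels == 1
    v00 = K[np.ix_(m0, m0)].mean()
    v11 = K[np.ix_(m1, m1)].mean()
    v01 = K[np.ix_(m0, m1)].mean()
    return -np.log(v01 / np.sqrt(v00 * v11))
```

For two well-separated point clouds this recovers the expected partition, and the CS divergence between the resulting cluster pdfs is large, illustrating the paper's point that the clustering criterion can be read as a divergence between Parzen-estimated densities.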