Mean shift: An information theoretic perspective

  • Authors:
  • Sudhir Rao; Allan de Medeiros Martins; José C. Príncipe

  • Affiliations:
  • Computational Neuroengineering Laboratory (CNEL), Department of ECE, University of Florida, Gainesville, FL 32611, USA; Department of Automation and Computer Engineering Technology Center, Federal University of Rio Grande do Norte, Natal, RN 59078-900, Brazil; Computational Neuroengineering Laboratory (CNEL), Department of ECE, University of Florida, Gainesville, FL 32611, USA

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2009

Abstract

This paper develops a new understanding of mean shift algorithms from an information theoretic perspective. We show that Gaussian blurring mean shift (GBMS) directly minimizes Renyi's quadratic entropy of the dataset and hence is unstable by definition. Further, its stable counterpart, Gaussian mean shift (GMS), minimizes Renyi's "cross" entropy, whose local stationary solutions are the modes of the dataset. In doing so, we answer the question "What do mean shift algorithms optimize?", thus naturally highlighting the properties of these algorithms. A consequence of this new understanding is the superior performance of GMS over GBMS, which we demonstrate in a wide variety of applications ranging from mode finding to clustering and image segmentation.
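
The distinction the abstract draws can be seen directly in the update rules: GMS keeps the kernel density anchored on the original dataset, so trajectories converge to its modes, while GBMS re-estimates the density from the points as they move, progressively blurring the dataset. The NumPy sketch below is purely illustrative and not the authors' implementation; the Gaussian bandwidth `sigma`, iteration counts, and the toy two-cluster data are assumptions chosen for demonstration.

```python
import numpy as np

def gaussian_weights(points, centers, sigma):
    """Pairwise Gaussian kernel weights between points and centers."""
    d2 = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def gms(data, sigma=1.0, iters=50):
    """Gaussian mean shift: density estimated from the *fixed* original data."""
    x = data.copy()
    for _ in range(iters):
        w = gaussian_weights(x, data, sigma)           # weights w.r.t. fixed dataset
        x = (w @ data) / w.sum(axis=1, keepdims=True)  # move each point to its local weighted mean
    return x

def gbms(data, sigma=1.0, iters=50):
    """Gaussian blurring mean shift: density re-estimated from the moved points."""
    x = data.copy()
    for _ in range(iters):
        w = gaussian_weights(x, x, sigma)              # weights w.r.t. current (blurred) set
        x = (w @ x) / w.sum(axis=1, keepdims=True)     # the whole dataset contracts each step
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical two-cluster toy data for illustration only.
    data = np.vstack([rng.normal(-3, 0.5, (50, 2)), rng.normal(3, 0.5, (50, 2))])
    print("GMS points (rounded):", np.unique(gms(data, sigma=1.0).round(1), axis=0))
    print("GBMS points (rounded):", np.unique(gbms(data, sigma=1.0, iters=200).round(1), axis=0))
```

With a moderate bandwidth, GMS leaves the points collapsed onto the two cluster modes, whereas running GBMS long enough keeps shrinking the evolving point set, reflecting the instability described in the abstract.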