Complementary Kernel Density Estimation

  • Authors:
  • Xu Miao; Ali Rahimi; Rajesh P. N. Rao

  • Affiliations:
  • Department of Computer Science and Engineering, University of Washington, Seattle, USA (all authors)

  • Venue:
  • Pattern Recognition Letters
  • Year:
  • 2012

Quantified Score

Hi-index 0.10

Abstract

Generative models for vision and pattern recognition have been overshadowed in recent years by powerful non-parametric discriminative models. These discriminative models can learn arbitrary decision boundaries between classes and have proved very effective in classification and detection problems. However, unlike generative models, they do not lend themselves naturally to more general vision tasks such as rendering novel images, denoising, and inpainting. In this paper we introduce Complementary Kernel Density Estimation (CKDE), a new generative model that adopts many of the features of non-parametric discriminative models: (1) CKDE allows complex decision surfaces and arbitrary class conditional distributions to be learned, (2) it is easy to train because the log likelihood of the model is concave, so it has no local maxima, and (3) one can train its class conditional distributions jointly to share information among the different classes. We first demonstrate that CKDE is more accurate in benchmark classification tasks than a purely discriminative method such as the support vector machine (SVM). We then show that it estimates the posterior probability of class labels more accurately than kernelized logistic regression. Our other results demonstrate that partial images can be accurately classified by marginalizing unobserved pixels out of the class conditional distributions, and that missing parts of an image can be inpainted using the learned generative model.
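The paper's full CKDE training objective is not reproduced in this abstract, but the two ideas it builds on — classifying by comparing class conditional kernel density estimates via Bayes' rule, and handling partial observations by marginalizing out unobserved dimensions — can be sketched generically. The sketch below uses plain isotropic Gaussian KDEs (not the complementary estimator of the paper); the function names, the fixed bandwidth, and the equal-priors assumption are all illustrative choices, not details from the paper. For an isotropic Gaussian kernel, marginalizing unobserved dimensions amounts to simply dropping those coordinates from every stored sample, which is what the partial-observation classifier exploits.

```python
import numpy as np

def gaussian_kde_logpdf(x, samples, bandwidth):
    """Log-density of an isotropic Gaussian KDE at point x.

    samples has shape (n, d); each sample contributes a Gaussian
    kernel with standard deviation `bandwidth` in every dimension.
    """
    d = samples.shape[1]
    diffs = (x - samples) / bandwidth                      # (n, d)
    log_kernels = (-0.5 * np.sum(diffs ** 2, axis=1)
                   - d * np.log(bandwidth)
                   - 0.5 * d * np.log(2.0 * np.pi))
    # log-mean-exp over the n kernels, computed stably
    return np.logaddexp.reduce(log_kernels) - np.log(len(samples))

def kde_classify(x, class_samples, bandwidth=0.5):
    """Bayes-rule classification with equal class priors:
    pick the class whose conditional density scores x highest."""
    scores = {c: gaussian_kde_logpdf(x, s, bandwidth)
              for c, s in class_samples.items()}
    return max(scores, key=scores.get)

def kde_classify_partial(x_obs, obs_dims, class_samples, bandwidth=0.5):
    """Classify from a partial observation: for an isotropic Gaussian
    KDE, marginalizing the unobserved dimensions is equivalent to
    dropping those coordinates from every training sample."""
    scores = {c: gaussian_kde_logpdf(x_obs, s[:, obs_dims], bandwidth)
              for c, s in class_samples.items()}
    return max(scores, key=scores.get)
```

A discriminatively trained model would only return a label here; the generative route additionally exposes the densities themselves, which is what makes marginalization (and, in the paper, inpainting) possible.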