Unifying Subspace and Distance Metric Learning with Bhattacharyya Coefficient for Image Classification

  • Authors:
  • Qingshan Liu; Dimitris N. Metaxas

  • Affiliations:
  • Department of Computer Science, Rutgers, The State University of New Jersey, Piscataway, NJ 08854-8019 (both authors)

  • Venue:
  • Emerging Trends in Visual Computing
  • Year:
  • 2009


Abstract

In this paper, we propose a unified scheme of subspace and distance metric learning under the Bayesian framework for image classification. According to the local distribution of the data, we divide the k-nearest neighbors of each sample into an intra-class set and an inter-class set, and we aim to learn a distance metric in the embedding subspace that makes the distances between the sample and its intra-class set smaller than the distances between it and its inter-class set. To reach this goal, we model the intra-class distances and the inter-class distances as samples from two different probability distributions, and we cast the goal as minimizing the overlap between the two distributions. Inspired by Bayesian classification error estimation, we formulate the objective function as minimizing the Bhattacharyya coefficient between the two distributions. We further extend the method with the kernel trick to learn a nonlinear distance metric. The power and generality of the proposed approach are demonstrated by a series of experiments on the CMU-PIE face database, the extended YALE face database, and the COREL-5000 nature image database.
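To make the overlap measure concrete, the following is a minimal sketch (not the paper's implementation) of the Bhattacharyya coefficient in the special case where the intra-class and inter-class distance distributions are each modeled as a one-dimensional Gaussian, for which the coefficient has a closed form: BC = exp(-D_B) with D_B = (μ1-μ2)²/(4(σ1²+σ2²)) + ½·ln((σ1²+σ2²)/(2σ1σ2)). The function name and the example means/variances are illustrative assumptions.

```python
import math

def bhattacharyya_coefficient(mu1, var1, mu2, var2):
    """Closed-form Bhattacharyya coefficient between two 1-D Gaussians.

    BC = exp(-D_B), where the Bhattacharyya distance is
    D_B = (mu1 - mu2)^2 / (4 * (var1 + var2))
        + 0.5 * ln((var1 + var2) / (2 * sqrt(var1 * var2)))
    BC is 1 for identical distributions and approaches 0 as they separate.
    """
    d_b = ((mu1 - mu2) ** 2 / (4.0 * (var1 + var2))
           + 0.5 * math.log((var1 + var2) / (2.0 * math.sqrt(var1 * var2))))
    return math.exp(-d_b)

# Identical distributions overlap completely: BC = 1.
print(bhattacharyya_coefficient(0.0, 1.0, 0.0, 1.0))  # 1.0

# Well-separated intra-class (small distances) vs. inter-class (large
# distances) distributions: BC is close to 0, the regime the learned
# metric is driven toward.
print(bhattacharyya_coefficient(0.5, 0.04, 3.0, 0.04))
```

Minimizing this quantity over the metric parameters pushes the two distance distributions apart, which is exactly the objective the abstract describes; the paper's actual optimization is over the subspace/metric, not over fixed Gaussian parameters as in this toy illustration.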