Parsimonious Mahalanobis kernel for the classification of high dimensional data

  • Authors:
  • M. Fauvel; J. Chanussot; J. A. Benediktsson; A. Villa

  • Affiliations:
  • INRA, DYNAFOR, BP 32607, Auzeville-Tolosane 31326, Castanet Tolosan, France; GIPSA-lab, Departement Image Signal, BP 46 - 38402 Saint Martin d'Hères, France; Department of Electrical and Computer Engineering, University of Iceland, Hjardarhagi 2-6, 107 Reykjavik, Iceland; Aresys srl, via Bistolfi 49, 20134 Milano and Dipartimento di Elettronica ed Informazione, Politecnico di Milano, 20133 Milano, Italy

  • Venue:
  • Pattern Recognition
  • Year:
  • 2013

Abstract

The classification of high-dimensional data with kernel methods is considered in this paper. Exploiting the emptiness property of high-dimensional spaces, a kernel based on the Mahalanobis distance is proposed. The computation of the Mahalanobis distance requires the inversion of a covariance matrix. In high-dimensional spaces, the estimated covariance matrix is ill-conditioned and its inversion is unstable or impossible. Using a parsimonious statistical model, namely the High Dimensional Discriminant Analysis model, the specific signal and noise subspaces are estimated for each considered class, making the inverse of the class-specific covariance matrix explicit and stable and leading to the definition of a parsimonious Mahalanobis kernel. An SVM-based framework is used for selecting the hyperparameters of the parsimonious Mahalanobis kernel by optimizing the so-called radius-margin bound. Experimental results on three high-dimensional data sets show that the proposed kernel is suitable for classifying high-dimensional data, providing better classification accuracies than the conventional Gaussian kernel.
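
To illustrate the general idea, below is a minimal Python sketch of a Gaussian-type kernel built on the Mahalanobis distance and used with a precomputed-kernel SVM. It is not the paper's method: the class-specific HDDA signal/noise subspace estimation is replaced, for illustration only, by a single shrinkage-regularized covariance inverse, and the data, the shrinkage strength alpha, and the kernel width gamma are hypothetical.

    import numpy as np
    from sklearn.svm import SVC

    def mahalanobis_kernel(X, Z, inv_cov, gamma=0.5):
        # Squared Mahalanobis distances between all pairs of rows of X and Z,
        # turned into a Gaussian-type kernel matrix.
        diff = X[:, None, :] - Z[None, :, :]                   # shape (n, m, d)
        d2 = np.einsum('nmd,de,nme->nm', diff, inv_cov, diff)  # shape (n, m)
        return np.exp(-gamma * d2)

    # Toy two-class data in a moderately high-dimensional space (hypothetical)
    rng = np.random.default_rng(0)
    d = 50
    X = np.vstack([rng.normal(0.0, 1.0, (100, d)),
                   rng.normal(0.5, 1.0, (100, d))])
    y = np.array([0] * 100 + [1] * 100)

    # Shrinkage-regularized covariance inverse: a simple stand-in for the
    # parsimonious (HDDA) covariance model used in the paper.
    cov = np.cov(X, rowvar=False)
    alpha = 0.1                                                # assumed shrinkage strength
    inv_cov = np.linalg.inv((1 - alpha) * cov + alpha * np.eye(d))

    K = mahalanobis_kernel(X, X, inv_cov)
    clf = SVC(kernel='precomputed').fit(K, y)
    print(clf.score(K, y))

The regularization step stands in for the role the parsimonious model plays in the paper: without it, inverting the empirical covariance of high-dimensional data is numerically unstable, which is precisely the problem the proposed kernel addresses.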