Weighted Mahalanobis Distance Kernels for Support Vector Machines

  • Authors:
  • Defeng Wang; D. S. Yeung; E. C. Tsang

  • Affiliations:
  • Chinese Univ. of Hong Kong, Shatin

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 2007

Abstract

The support vector machine (SVM) has been demonstrated to be a very effective classifier in many applications, but its performance is still limited because data distribution information is underutilized in determining the decision hyperplane. Most existing kernels employed in nonlinear SVMs measure the similarity between a pair of pattern images by the Euclidean inner product or the Euclidean distance of the corresponding input patterns, which ignores the data distribution tendency and makes the SVM essentially a "local" classifier. In this paper, we take a step toward a new paradigm of kernels by incorporating data-specific knowledge into existing kernels. We first find the data structure for each class adaptively in the input space via agglomerative hierarchical clustering (AHC), and then construct weighted Mahalanobis distance (WMD) kernels using the detected distribution information. In WMD kernels, the similarity between two pattern images is determined not only by the Mahalanobis distance (MD) between their corresponding input patterns but also by the sizes of the clusters they reside in. Although WMD kernels are not guaranteed to be positive definite (pd) or conditionally positive definite (cpd), satisfactory classification results can still be achieved because the regularizers in SVMs with WMD kernels are empirically positive in pseudo-Euclidean (pE) spaces. Experimental results on both synthetic and real-world data sets show the effectiveness of "plugging" data structure into existing kernels.
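The abstract's core idea can be sketched in code: compute a Gaussian-type similarity from the squared Mahalanobis distance between two input patterns, then scale it by the relative sizes of the clusters the patterns belong to. This is an illustrative sketch only; the paper's exact WMD kernel formula is not given in the abstract, so the multiplicative cluster-size weighting below (`size / n_total` for each pattern) and the `gamma` bandwidth parameter are hypothetical choices.

```python
import numpy as np

def mahalanobis_rbf(x, y, inv_cov, gamma=1.0):
    """Gaussian kernel on the squared Mahalanobis distance (MD)
    between input patterns x and y, given an inverse covariance."""
    d = x - y
    md2 = d @ inv_cov @ d          # squared MD
    return np.exp(-gamma * md2)

def wmd_kernel(x, y, inv_cov, size_x, size_y, n_total, gamma=1.0):
    """Hypothetical WMD-style kernel: the MD-based similarity is scaled
    by the relative sizes of the clusters containing x and y, so that
    similarity reflects both distance and cluster size, as the abstract
    describes. The paper's actual weighting scheme may differ."""
    weight = (size_x / n_total) * (size_y / n_total)
    return weight * mahalanobis_rbf(x, y, inv_cov, gamma)

# Toy usage: with an identity covariance, the MD reduces to the
# Euclidean distance.
x = np.array([0.0, 0.0])
y = np.array([1.0, 0.0])
K = wmd_kernel(x, y, np.eye(2), size_x=30, size_y=20, n_total=50)
```

In practice, the per-class cluster means, covariances, and sizes would come from the AHC step described above; note also that such a kernel need not be positive definite, which is why the paper has to argue empirically that the resulting SVM regularizers stay positive in pseudo-Euclidean spaces.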