Feature Weighting Using Margin and Radius Based Error Bound Optimization in SVMs

  • Authors:
  • Huyen Do, Alexandros Kalousis, Melanie Hilario

  • Affiliations:
  • Computer Science Department, University of Geneva, 1227 Carouge, Switzerland (all authors)

  • Venue:
  • ECML PKDD '09 Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases: Part I
  • Year:
  • 2009

Abstract

The Support Vector Machine error bound is a function of the margin and the radius. Standard SVM algorithms maximize the margin within a given feature space, so the radius is fixed and therefore ignored in the optimization. We propose an extension of the standard SVM optimization in which we also account for the radius, in order to produce a tighter error bound than is obtained by controlling the margin alone. We use a second set of parameters, μ, that control the radius, thereby introducing an explicit feature weighting mechanism into the SVM algorithm. We impose an l1 constraint on μ, which yields a sparse weight vector and thus performs feature selection. The original formulation is not convex; we give a convex approximation and show how to solve it. We experiment with real-world datasets and report very good predictive performance compared to the standard SVM.
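
To make the quantity being controlled concrete: the classical radius–margin result bounds the leave-one-out error of a hard-margin SVM by R²‖w‖²/n, where R is the radius of the smallest ball enclosing the data and 1/‖w‖ is the margin. The sketch below evaluates this product for a fixed feature-weight vector μ that rescales each feature by sqrt(μ_j) under an l1 normalization. It is not the authors' optimization procedure; the scikit-learn SVC call, the centroid-based radius estimate, and the toy data are assumptions made purely for illustration.

```python
# Hedged sketch (not the paper's algorithm): evaluate R^2 * ||w||^2
# for a given feature-weight vector mu, after rescaling the features.
import numpy as np
from sklearn.svm import SVC

def radius_margin_quantity(X, y, mu, C=1e3):
    """Return R^2 * ||w||^2 for features scaled by sqrt(mu)."""
    mu = np.asarray(mu, dtype=float)
    mu = mu / mu.sum()                     # enforce the l1 constraint: sum(mu) = 1
    Xw = X * np.sqrt(mu)                   # explicit feature weighting in input space

    svm = SVC(kernel="linear", C=C).fit(Xw, y)
    w_sq = float(np.sum(svm.coef_ ** 2))   # ||w||^2, i.e. 1 / margin^2

    # Crude stand-in for the enclosing-ball radius: farthest point from the centroid.
    center = Xw.mean(axis=0)
    r_sq = float(np.max(np.sum((Xw - center) ** 2, axis=1)))

    return r_sq * w_sq

# Toy example: only the first two features carry signal, so a sparse mu
# that zeroes the others should shrink the radius-margin quantity.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 4))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
print(radius_margin_quantity(X, y, mu=[0.25, 0.25, 0.25, 0.25]))  # uniform weights
print(radius_margin_quantity(X, y, mu=[0.6, 0.4, 0.0, 0.0]))      # sparse weights
```

In the paper the weights μ are learned jointly with the SVM parameters by optimizing (a convex approximation of) this bound; the sketch only shows how a given μ changes the radius and margin that enter it.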