Softening the margin in discrete SVM

  • Authors:
  • Carlotta Orsenigo; Carlo Vercellis

  • Affiliations:
  • Dip. di Scienze Economiche, Aziendali e Statistiche, Università di Milano, Italy; Dip. di Ingegneria Gestionale, Politecnico di Milano, Italy

  • Venue:
  • ICDM'07: Proceedings of the 7th Industrial Conference on Advances in Data Mining: Theoretical Aspects and Applications
  • Year:
  • 2007

Abstract

Discrete support vector machines are classification models recently introduced in the context of statistical learning theory. Their distinctive feature is the formulation of mixed integer programming problems aimed at deriving optimal separating hyperplanes with minimum empirical error and maximum generalization capability. In this paper a new family of discrete SVMs is proposed, in which the hyperplane establishes a variable softening of the margin to improve the separation among distinct classes. Theoretical bounds are derived to finely tune the parameters of the optimization problem. Computational tests on benchmark datasets from the biolife science application domain indicate the effectiveness of the proposed approach, which appears to dominate traditional SVM in terms of accuracy and percentage of support vectors.
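
The abstract refers to a mixed integer programming formulation in which misclassification is counted through discrete variables rather than continuous slacks. As a rough illustration only, the sketch below shows a generic big-M soft-margin formulation of that kind, written with the open-source PuLP modeler; the toy data, the trade-off parameter C, the big-M constant M and the binary indicators z are illustrative assumptions and do not reproduce the paper's exact model or its variable margin softening.

```python
# Minimal sketch of a discrete soft-margin SVM as a mixed integer program.
# All parameter values and data below are illustrative assumptions.
import pulp

# Toy 2-D data: two roughly separable classes, labels in {-1, +1}.
X = [[1.0, 2.0], [2.0, 3.0], [3.0, 3.0],   # class +1
     [6.0, 1.0], [7.0, 2.0], [8.0, 0.5]]   # class -1
y = [1, 1, 1, -1, -1, -1]
n, d = len(X), len(X[0])

C = 10.0   # trade-off between margin term and misclassification count (assumption)
M = 1e3    # big-M constant that deactivates the margin constraint when z_i = 1

prob = pulp.LpProblem("discrete_svm_sketch", pulp.LpMinimize)
w = [pulp.LpVariable(f"w_{j}") for j in range(d)]                # hyperplane coefficients
u = [pulp.LpVariable(f"u_{j}", lowBound=0) for j in range(d)]    # linearization of |w_j|
b = pulp.LpVariable("b")                                         # intercept
z = [pulp.LpVariable(f"z_{i}", cat="Binary") for i in range(n)]  # 1 iff point i violates the margin

# Objective: L1 proxy for the margin term plus C times the number of margin violations.
prob += pulp.lpSum(u) + C * pulp.lpSum(z)

# Enforce u_j >= |w_j|.
for j in range(d):
    prob += u[j] >= w[j]
    prob += u[j] >= -w[j]

# Big-M soft-margin constraints: either point i respects the margin or z_i = 1.
for i in range(n):
    score = pulp.lpSum(X[i][j] * w[j] for j in range(d)) + b
    prob += y[i] * score >= 1 - M * z[i]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("w =", [v.value() for v in w], "b =", b.value())
print("margin violations =", sum(int(v.value()) for v in z))
```

Counting violations with binary variables, instead of summing continuous slacks as in standard soft-margin SVM, is what makes the problem a mixed integer program; the margin-softening scheme studied in the paper refines how those violations are penalized.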