Linear potential proximal support vector machines for pattern classification

  • Authors:
  • Reshma Khemchandani; Suresh Chandra

  • Affiliations:
  • Department of Mathematics, Indian Institute of Technology, New Delhi, India (both authors)

  • Venue:
  • Optimization Methods & Software - Mathematical programming in data mining and machine learning
  • Year:
  • 2008

Abstract

Support vector machine (SVM) classifiers attempt to find a maximum-margin hyperplane by solving a convex optimization problem. The conventional SVM approach involves minimizing a quadratic function subject to linear inequality constraints. However, the margin is not scale invariant, so a linear transformation of the data tends to affect the classification accuracy. Recently, potential SVMs have addressed this scale dependence by applying an appropriate scaling to the data to improve classification accuracy. In this paper, we propose a novel SVM formulation that is in the spirit of the potential SVM but requires only a single matrix inversion to find the classifier. Experimental results bear out the efficacy of the proposed classifier.
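
The abstract's key computational claim is that the classifier is obtained from a single matrix inversion rather than a quadratic program. The sketch below illustrates that idea using a classical linear proximal SVM in the style of Fung and Mangasarian, where the separating plane solves a small linear system; it is not the paper's potential-based formulation, and the function names and the regularization parameter `nu` are illustrative assumptions.

```python
import numpy as np

def proximal_svm_fit(A, y, nu=1.0):
    """Minimal linear proximal-SVM sketch (Fung & Mangasarian style).

    The classifier (w, gamma) comes from one (d+1) x (d+1) linear solve,
    i.e. a single matrix inversion, instead of a QP with inequality
    constraints. This illustrates the proximal idea only, not the
    potential-SVM scaling proposed in the paper.
    """
    m, d = A.shape
    e = np.ones((m, 1))
    D = np.diag(y.ravel())        # class labels +1 / -1 on the diagonal
    E = np.hstack([A, -e])        # augmented data matrix [A, -e]
    # Solve (I/nu + E^T E) [w; gamma] = E^T D e  -- one matrix inversion
    lhs = np.eye(d + 1) / nu + E.T @ E
    rhs = E.T @ (D @ e)
    sol = np.linalg.solve(lhs, rhs)
    w, gamma = sol[:-1], sol[-1, 0]
    return w, gamma

def proximal_svm_predict(A, w, gamma):
    """Classify points by the sign of the decision function A w - gamma."""
    return np.sign(A @ w - gamma).ravel()

if __name__ == "__main__":
    # Toy two-class problem: two Gaussian blobs in the plane.
    rng = np.random.default_rng(0)
    A = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
    y = np.hstack([-np.ones(50), np.ones(50)])
    w, gamma = proximal_svm_fit(A, y, nu=10.0)
    print("training accuracy:", (proximal_svm_predict(A, w, gamma) == y).mean())
```

Because the fitting step reduces to solving a (d+1)-dimensional linear system, its cost is dominated by forming E^T E, which is attractive when the number of training points is much larger than the number of features.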