Quadratic programming formulations for classification and regression

  • Authors:
  • Robin C. Gilbert; Theodore B. Trafalis

  • Affiliations:
  • Laboratory of Optimization and Intelligent Systems, School of Industrial Engineering, University of Oklahoma, Norman, OK, USA (both authors)

  • Venue:
  • Optimization Methods & Software - THE JOINT EUROPT-OMS CONFERENCE ON OPTIMIZATION, 4-7 JULY, 2007, PRAGUE, CZECH REPUBLIC, PART II
  • Year:
  • 2009

Abstract

We reformulate the support vector machine approach to classification and regression using a methodology different from the classical 'largest margin' paradigm. From this reformulation we derive extremely simple quadratic programming problems that admit general symbolic solutions to the classical problems of geometric classification and regression. The resulting class of learning machines is also robust to small perturbations and to corrupted or missing data in the training sets, provided that the amplitude of the perturbations is known approximately. A high-performance framework for very large-scale classification and regression problems, based on a Voronoi tessellation of the input space, is also introduced in this work. Our approach has been tested on seven benchmark databases with a noticeable gain in computational time compared with standard decomposition techniques such as SVMlight.
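
The abstract does not reproduce the authors' quadratic programs or their exact decomposition, but the general idea of a Voronoi-tessellation framework can be illustrated with a minimal sketch: partition the training data into Voronoi cells (here induced by k-means centroids, which is an assumption, not a detail taken from the paper), train a standard SVM inside each cell, and route each query point to the model of its nearest centroid. All function names and parameters below are illustrative, not the authors' implementation.

```python
# Hypothetical sketch of a Voronoi-style decomposition for large-scale SVM training.
# Cells are induced by k-means centroids; one local SVM is trained per cell and
# queries are routed to the model of their nearest centroid.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC
from sklearn.datasets import make_classification


def fit_voronoi_svms(X, y, n_cells=8, random_state=0):
    """Partition (X, y) into Voronoi cells and train one SVM per cell."""
    km = KMeans(n_clusters=n_cells, n_init=10, random_state=random_state).fit(X)
    models = {}
    for cell in range(n_cells):
        mask = km.labels_ == cell
        if np.unique(y[mask]).size < 2:
            # Degenerate cell (single class): fall back to a constant predictor.
            models[cell] = int(y[mask][0]) if mask.any() else 0
        else:
            models[cell] = SVC(kernel="rbf", C=1.0).fit(X[mask], y[mask])
    return km, models


def predict_voronoi_svms(km, models, X):
    """Route each query to the SVM of its nearest centroid and predict."""
    cells = km.predict(X)
    out = np.empty(len(X), dtype=int)
    for cell in np.unique(cells):
        idx = cells == cell
        model = models[cell]
        out[idx] = model if isinstance(model, int) else model.predict(X[idx])
    return out


if __name__ == "__main__":
    X, y = make_classification(n_samples=5000, n_features=10, random_state=0)
    km, models = fit_voronoi_svms(X, y)
    acc = (predict_voronoi_svms(km, models, X) == y).mean()
    print(f"training accuracy of the cell-local SVMs: {acc:.3f}")
```

Because each local quadratic program involves only the points of one cell, the per-cell problems are much smaller than a single global SVM, which is the kind of computational saving the abstract reports; the specific formulation and performance figures are, of course, those of the paper itself.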