Soft Nearest Convex Hull Classifier

  • Authors:
  • Georgi Nalbantov; Evgueni Smirnov

  • Affiliations:
  • Department of Knowledge Engineering, Faculty of Humanities and Sciences, Maastricht University, Netherlands, email: {g.nalbantov,smirnov}@maastrichtuniversity.nl

  • Venue:
  • Proceedings of ECAI 2010: 19th European Conference on Artificial Intelligence
  • Year:
  • 2010

Abstract

Consider the classification task of assigning a test instance to one of two or more possible classes. An intuitive way to proceed is to assign the instance to the class to which its distance is minimal. If the distance to the convex hull of a class is taken as the distance measure, the resulting classification method is the Nearest Convex Hull (NCH) classifier. Two key issues severely restrict the applicability of this method, and we solve both in this paper: first, how to handle class overlap, and second, how to provide (nonlinear) solutions with better generalization ability. The first problem is handled using so-called kernel functions and slack variables; the second is dealt with by a penalization term that suppresses overly complex solutions. We call the resulting method the soft-NCH classifier. In spirit and computationally, the method is close to the popular Support Vector Machine (SVM) classifier and can be viewed as an instance-based large-margin classification technique. Advantages of the soft-NCH classifier include robustness to outliers, good generalization ability, and a natural handling of multi-class problems. We compare the performance of soft-NCH against state-of-the-art techniques and report promising results.
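
The hard NCH rule sketched in the abstract can be made concrete: the distance from a test point to a class is the distance to the convex hull of that class's training points, which is the optimum of a small quadratic program over convex-combination weights, and the point is assigned to the class with the smallest such distance. The sketch below is a minimal, assumed implementation of this hard NCH baseline only; it does not include the paper's kernels, slack variables, or penalization term, the names dist_to_convex_hull and nch_predict are illustrative, and a generic SciPy optimizer stands in for a dedicated QP solver.

    import numpy as np
    from scipy.optimize import minimize

    def dist_to_convex_hull(x, X):
        """Distance from point x to the convex hull of the rows of X,
        via  min_a ||x - X^T a||^2  s.t.  a >= 0, sum(a) = 1."""
        n = X.shape[0]
        a0 = np.full(n, 1.0 / n)                      # start at the centroid
        obj = lambda a: np.sum((x - X.T @ a) ** 2)    # squared distance to the combination
        cons = ({'type': 'eq', 'fun': lambda a: np.sum(a) - 1.0},)
        bnds = [(0.0, 1.0)] * n
        res = minimize(obj, a0, method='SLSQP', bounds=bnds, constraints=cons)
        return np.sqrt(res.fun)

    def nch_predict(x, class_data):
        """Assign x to the class whose convex hull is nearest (hard NCH)."""
        return min(class_data, key=lambda c: dist_to_convex_hull(x, class_data[c]))

    # Toy usage with two well-separated classes (hypothetical data).
    rng = np.random.default_rng(0)
    classes = {'A': rng.normal(loc=-2.0, size=(20, 2)),
               'B': rng.normal(loc=+2.0, size=(20, 2))}
    print(nch_predict(np.array([1.5, 1.8]), classes))  # expected: 'B'

Because the per-class distance computation is independent of the number of classes, the multi-class case needs no one-vs-one or one-vs-all decomposition, which is the "naturally easy handling of multi-class problems" the abstract refers to.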