Single-Class Classification with Mapping Convergence

  • Authors: Hwanjo Yu
  • Affiliation: Department of Computer Science, University of Iowa, Iowa City, IA 52242, USA
  • Venue: Machine Learning
  • Year: 2005

Abstract

Single-Class Classification (SCC) seeks to distinguish one class of data from the universal set of multiple classes. We call the target class positive and the complement set of samples negative. In SCC problems, it is assumed that a reasonable sample of the negative data is not available. SCC problems are prevalent in the real world, where positive and unlabeled data are widely available but negative data are hard or expensive to acquire. We present an SCC algorithm called Mapping Convergence (MC) that computes an accurate boundary of the target class from positive and unlabeled data (without labeled negative data). The basic idea of MC is to exploit the natural "gap" between positive and negative data by incrementally labeling negative data from the unlabeled data using the margin maximization property of SVM. We also present Support Vector Mapping Convergence (SVMC), which optimizes the MC algorithm for fast training. Our analyses show that MC and SVMC, without labeled negative data, significantly outperform other SCC methods. They generate boundaries as accurate as those of a standard SVM trained on fully labeled data when the positive data are not severely under-sampled and there exist gaps between the positive and negative classes in the feature space. Our results also show that SVMC trains much faster than MC with very close accuracy.
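
The sketch below illustrates the two-stage idea the abstract describes: a weak one-class classifier first extracts "strong negatives" from the unlabeled data, then an SVM is retrained iteratively, absorbing newly predicted negatives until no more are found. It is a minimal, assumed rendering using scikit-learn's OneClassSVM and SVC (the paper's own choice of weak classifier and kernel settings may differ); function and variable names are illustrative, not from the paper.

```python
import numpy as np
from sklearn.svm import OneClassSVM, SVC

def mapping_convergence(P, U, max_iter=20):
    """Sketch of the Mapping Convergence idea: P = positive samples,
    U = unlabeled samples (both 2-D NumPy arrays). Returns the final SVM."""
    # Stage 1: a weak one-class classifier marks "strong negatives" in U,
    # i.e. unlabeled points that look least like the positive class.
    ocsvm = OneClassSVM(nu=0.5, gamma="scale").fit(P)
    strong_neg_mask = ocsvm.predict(U) == -1
    N = U[strong_neg_mask]            # current negative set
    remaining = U[~strong_neg_mask]   # still-unlabeled points

    # Stage 2: iteratively retrain an SVM on P vs. N; unlabeled points the
    # SVM labels negative migrate into N until the boundary converges.
    clf = None
    for _ in range(max_iter):
        if len(N) == 0 or len(remaining) == 0:
            break
        X = np.vstack([P, N])
        y = np.concatenate([np.ones(len(P)), -np.ones(len(N))])
        clf = SVC(kernel="rbf", gamma="scale").fit(X, y)
        pred = clf.predict(remaining)
        newly_negative = remaining[pred == -1]
        if len(newly_negative) == 0:  # convergence: no new negatives found
            break
        N = np.vstack([N, newly_negative])
        remaining = remaining[pred == 1]
    return clf
```

In this reading, the SVM's margin maximization pushes the boundary toward the natural gap between classes, so each iteration peels off a further layer of likely negatives from the unlabeled pool; SVMC, per the abstract, speeds this loop up while keeping accuracy close to MC.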