Learning SVMs from Sloppily Labeled Data

  • Authors:
  • Guillaume Stempfel; Liva Ralaivola

  • Affiliations:
  • Laboratoire d'Informatique Fondamentale de Marseille, Aix-Marseille Université

  • Venue:
  • ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
  • Year:
  • 2009

Abstract

This paper proposes a modelling of Support Vector Machine (SVM) learning to address the problem of learning with sloppy labels. In binary classification, learning with sloppy labels is the situation where a learner is provided with labelled data in which the observed label of each example is a possibly noisy (flipped) version of its true class, and where the probability of flipping a label y to −y depends only on y. The noise probability is therefore constant and uniform within each class: learning with positive and unlabeled data is, for instance, a motivating example for this model. In order to learn with sloppy labels, we propose SloppySvm, an SVM algorithm that minimizes a tailored nonconvex functional shown to be a uniform estimate of the noise-free SVM functional. Several experiments validate the soundness of our approach.
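
As a minimal sketch of the class-conditional noise model described in the abstract (not the authors' SloppySvm algorithm itself), the snippet below simulates sloppily labelled data: each label y in {−1, +1} is flipped to −y with a probability that depends only on y. The flip rates rho_pos and rho_neg are hypothetical illustrative parameters, not values taken from the paper.

```python
import numpy as np

def sloppy_labels(y, rho_pos=0.2, rho_neg=0.3, rng=None):
    """Simulate class-conditional ("sloppy") label noise.

    Each true label y in {-1, +1} is flipped to -y with a probability that
    depends only on y: rho_pos for positive examples, rho_neg for negatives.
    The flip rates here are illustrative values, not taken from the paper.
    """
    rng = np.random.default_rng() if rng is None else rng
    y = np.asarray(y)
    flip_prob = np.where(y == 1, rho_pos, rho_neg)   # noise is constant within each class
    flips = rng.random(y.shape) < flip_prob          # Bernoulli flip decision per example
    return np.where(flips, -y, y)

# Example: positive-and-unlabeled data can be seen as a special case of this
# model, where (roughly) only positives are ever observed as negatives.
y_true = np.array([1, 1, -1, -1, 1, -1])
y_observed = sloppy_labels(y_true, rho_pos=0.4, rho_neg=0.0)
```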