This paper proposes a model of Support Vector Machine (SVM) learning that addresses the problem of learning with sloppy labels. In binary classification, learning with sloppy labels is the situation where the learner is provided with labelled data whose observed labels are possibly noisy (flipped) versions of the true class labels, and where the probability of flipping a label y to -y depends only on y. The noise probability is therefore constant and uniform within each class; learning with positive and unlabeled data is, for instance, a motivating example of this model. To learn with sloppy labels, we propose SloppySvm, an SVM algorithm that minimizes a tailored nonconvex functional shown to be a uniform estimate of the noise-free SVM functional. Several experiments validate the soundness of our approach.
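The class-conditional noise process described above can be sketched as follows. This is only an illustrative simulation of the noise model, not the paper's algorithm; the function name `add_sloppy_labels` and the parameters `p_pos` and `p_neg` (the flip probabilities for the positive and negative class) are illustrative, not taken from the paper.

```python
import numpy as np

def add_sloppy_labels(y, p_pos, p_neg, rng=None):
    """Flip each label y in {-1, +1} to -y with a probability that
    depends only on the true class: p_pos for y = +1, p_neg for y = -1.

    This mirrors the noise model in the abstract: the flip probability
    is constant and uniform within each class. (Illustrative sketch.)
    """
    rng = np.random.default_rng(rng)
    # Per-example flip probability, determined solely by the true class.
    flip_prob = np.where(y == 1, p_pos, p_neg)
    flips = rng.random(len(y)) < flip_prob
    # Flip the selected labels; the rest are observed unchanged.
    return np.where(flips, -y, y)
```

For example, setting `p_pos = 0` and `p_neg > 0` recovers a positive-and-unlabeled-style setting in which only one class's labels are corrupted, which is the motivating special case mentioned in the abstract.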