The ν-support vector classification (ν-SVC) algorithm was shown to work well and to provide intuitive interpretations: for example, the parameter ν roughly specifies the fraction of support vectors. Although ν corresponds to a fraction, it cannot take the entire range between 0 and 1 in its original form. This problem was settled by a non-convex extension of ν-SVC, and the extended method was experimentally shown to generalize better than the original ν-SVC. However, its generalization performance and the convergence properties of its optimization algorithm have not yet been studied theoretically. In this paper, we provide new theoretical insights into these issues and propose a novel ν-SVC algorithm with guaranteed generalization performance and convergence properties.
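For context, a sketch of the standard (convex) ν-SVC primal problem of Schölkopf et al., which the non-convex extension above modifies, is given below; here (x_i, y_i), i = 1, …, n, with y_i ∈ {−1, +1} are the training data, ξ_i are slack variables, and ρ/‖w‖ is the margin:

```latex
\begin{align*}
\min_{w,\,b,\,\rho,\,\xi}\quad & \frac{1}{2}\|w\|^2 \;-\; \nu\rho \;+\; \frac{1}{n}\sum_{i=1}^{n}\xi_i \\
\text{s.t.}\quad & y_i\,(w^\top x_i + b) \;\ge\; \rho - \xi_i,
\qquad \xi_i \ge 0 \quad (i = 1,\dots,n), \qquad \rho \ge 0.
\end{align*}
```

When the optimal ρ is positive, ν is an upper bound on the fraction of margin errors and a lower bound on the fraction of support vectors, which is the interpretation referred to above. The admissible range of ν is limited (for ν-SVC with n₊ positive and n₋ negative examples, ν must not exceed 2 min(n₊, n₋)/n for the problem to admit a nontrivial solution), and this restriction is what the non-convex extension removes.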