The goal of this paper is to announce results on mathematical properties of so-called L2 soft-margin support vector machines (L2-SVMs) for data classification. Their dual formulations form a family of quadratic programming problems parameterized by a single regularization parameter. The dependence of the solution on this parameter is examined, and properties such as continuity, differentiability, monotonicity, and convexity are established. It is shown that the solution and the objective value of the hard-margin SVM allow the slack variables of the L2-SVMs to be estimated. The asymptotic behavior of the solutions of the primal problems in the inseparable case is also investigated. An auxiliary dual problem is used as the investigation tool; it is in fact the dual formulation of a nearly identical L2-SVM primal.
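To make the family of dual problems concrete, the following is a minimal sketch (not code from the paper) of the standard L2-SVM dual for a linear kernel: the Gram matrix is shifted by I/C, the only constraints are α ≥ 0 and Σ αᵢyᵢ = 0, and at the optimum the slack variables satisfy ξᵢ = αᵢ/C. The toy data, the function name `l2_svm_dual`, and the use of SciPy's SLSQP solver are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical toy two-class data set (illustration only).
X = np.array([[0., 0.], [1., 0.], [0., 1.], [2., 2.], [3., 2.], [2., 3.]])
y = np.array([-1., -1., -1., 1., 1., 1.])

def l2_svm_dual(X, y, C):
    """Solve the L2 soft-margin SVM dual for one value of the
    regularization parameter C. The L2 penalty appears in the dual
    only as a diagonal shift of the Gram matrix by I/C."""
    n = len(y)
    K = X @ X.T + np.eye(n) / C          # linear kernel, shifted diagonal
    Q = np.outer(y, y) * K               # Q_ij = y_i y_j (k(x_i,x_j) + delta_ij/C)
    fun = lambda a: 0.5 * a @ Q @ a - a.sum()   # negated dual objective (minimized)
    jac = lambda a: Q @ a - np.ones(n)
    cons = {"type": "eq", "fun": lambda a: a @ y, "jac": lambda a: y}
    bounds = [(0.0, None)] * n           # alpha_i >= 0 only: no upper box in L2-SVMs
    res = minimize(fun, np.zeros(n), jac=jac,
                   bounds=bounds, constraints=[cons], method="SLSQP")
    return res.x

alpha = l2_svm_dual(X, y, C=1.0)
xi = alpha / 1.0                         # slack estimates: xi_i = alpha_i / C
```

Note that, unlike the L1 soft-margin dual, there is no upper bound C on the multipliers; the regularization parameter enters only through the diagonal shift, which is what makes the dependence of the solution on C amenable to the continuity and monotonicity analysis described above.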