The nature of statistical learning theory
Generalization performance of support vector machines and other pattern classifiers
Advances in kernel methods
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
Choosing Multiple Parameters for Support Vector Machines
Machine Learning
Radius margin bounds for support vector machines with the RBF kernel
Neural Computation
A Support Vector Machine with a Hybrid Kernel and Minimal Vapnik-Chervonenkis Dimension
IEEE Transactions on Knowledge and Data Engineering
Efficient tuning of SVM hyperparameters using radius/margin bound and iterative algorithms
IEEE Transactions on Neural Networks
A fast and sparse implementation of multiclass kernel perceptron algorithm
ISNN'06 Proceedings of the Third international conference on Advances in Neural Networks - Volume Part I
The VC dimension of the set of separating hyperplanes is bounded by the ratio of the squared radius of the smallest sphere enclosing the data to the squared margin. Once a kernel and its parameters are chosen, this radius is fixed, so a hard-margin SVM minimizes the ratio by minimizing the squared 2-norm of the weight vector. In this paper, a bound on the squared radius in the feature space is derived that depends on the scaling factor of the RBF kernel and a bound on the squared radius in the input space. Since the squared 2-norm of the weight vector can be written as a quadratic form, a simple VC dimension bound for classification with the RBF kernel is obtained. Minimizing this bound leads to two constrained nonlinear programming problems, one for the linearly separable case and one for the nonlinearly separable case. Solving them designs a nonlinear classifier with the RBF kernel and determines the scaling factor of the kernel simultaneously.
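The radius-margin quantity R²·‖w‖² that the abstract refers to can be evaluated for a trained hard-margin SVM. The sketch below is illustrative only: the toy data, the choice of scaling factor `gamma`, the large-`C` approximation of a hard margin, and the centroid-based upper bound on the enclosing-sphere radius are all assumptions for demonstration, not the paper's actual bound or its nonlinear programming formulation.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy two-class data (illustrative assumption, not from the paper)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)), rng.normal(1.0, 0.3, (20, 2))])
y = np.r_[-np.ones(20), np.ones(20)]

gamma = 0.5                                  # RBF scaling factor (assumed value)
clf = SVC(kernel="rbf", gamma=gamma, C=1e6)  # very large C approximates a hard margin
clf.fit(X, y)

def rbf(A, B, g):
    """Gram matrix of the RBF kernel exp(-g * ||a - b||^2)."""
    d = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-g * d)

# Squared 2-norm of the weight vector in feature space, as a quadratic
# form in the dual coefficients:
#   ||w||^2 = sum_ij (alpha_i y_i)(alpha_j y_j) K(x_i, x_j)
coef = clf.dual_coef_.ravel()                # alpha_i * y_i for support vectors
K_sv = rbf(clf.support_vectors_, clf.support_vectors_, gamma)
w_sq = coef @ K_sv @ coef

# Upper bound on the squared radius of the smallest enclosing sphere in
# feature space, taking the kernel centroid as the sphere center:
#   R^2 <= max_i [ K(x_i,x_i) - (2/n) sum_j K(x_i,x_j) + (1/n^2) sum_jk K(x_j,x_k) ]
K = rbf(X, X, gamma)
R_sq_bound = np.max(np.diag(K) - 2.0 * K.mean(axis=1) + K.mean())

vc_bound = R_sq_bound * w_sq                 # radius-margin style bound R^2 * ||w||^2
print(w_sq, R_sq_bound, vc_bound)
```

Note that for the RBF kernel K(x, x) = 1, so every mapped point lies on the unit sphere in feature space and the radius bound stays small; this is why, as the abstract states, the feature-space radius can be controlled through the kernel's scaling factor.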