MSVMpack: A Multi-Class Support Vector Machine Package
The Journal of Machine Learning Research
To set the values of the hyperparameters of a support vector machine (SVM), the method of choice is cross-validation. Several upper bounds on the leave-one-out error of the pattern-recognition SVM have been derived. One of the most popular is the radius-margin bound. It applies to the hard-margin machine and, by extension, to the 2-norm SVM. In this article, we introduce the first quadratic-loss multi-class SVM: the M-SVM2. It can be seen as a direct extension of the 2-norm SVM to the multi-class case, which we establish by deriving the corresponding generalized radius-margin bound.
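To make the bi-class bound concrete: for a hard-margin SVM whose separator `w` is scaled so that every training point has functional margin at least 1, Vapnik's radius-margin bound states that the number of leave-one-out errors is at most R²‖w‖², where R is the radius of the smallest ball enclosing the data in feature space. The sketch below (an illustration under stated assumptions, not code from the paper or from MSVMpack) evaluates this bound on a toy separable dataset; the true minimum enclosing ball is replaced by the simpler ball around the centroid, which only enlarges R and therefore keeps the bound valid.

```python
# Radius-margin bound R^2 * ||w||^2 on the number of leave-one-out
# errors of a hard-margin SVM (bi-class case, linear kernel).
# Assumption: `w` is scaled so that y_i * <w, x_i> >= 1 for all i.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def enclosing_radius(X):
    """Radius of the ball around the centroid that encloses all points.

    This is an upper approximation of the minimum enclosing ball,
    so the resulting radius-margin bound remains a valid bound."""
    d = len(X[0])
    c = [sum(x[i] for x in X) / len(X) for i in range(d)]
    return max(
        math.sqrt(sum((xi - ci) ** 2 for xi, ci in zip(x, c)))
        for x in X
    )

def radius_margin_bound(X, y, w):
    """Upper bound on the leave-one-out error *rate*: R^2 ||w||^2 / m."""
    assert all(yi * dot(w, xi) >= 1 - 1e-9 for xi, yi in zip(X, y)), \
        "w must realize a functional margin of at least 1"
    R = enclosing_radius(X)
    return R * R * dot(w, w) / len(X)

# Toy separable data in the plane: the hard-margin separator is
# x1 = 0, i.e. w = (1, 0), with geometric margin 1.
X = [(-1.5, 0.5), (-1.0, -0.5), (1.0, 0.5), (1.5, -0.5)]
y = [-1, -1, 1, 1]
w = (1.0, 0.0)
print(radius_margin_bound(X, y, w))  # R^2 = 2.5, ||w||^2 = 1, m = 4
```

In a model-selection loop one would retrain the machine for each candidate hyperparameter value (kernel parameter, regularization constant) and keep the value minimizing this bound, avoiding the cost of an actual leave-one-out run; the M-SVM2 extends the same idea to the multi-class case via the generalized bound derived in the article.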