Regularization theory and neural networks architectures. Neural Computation.
The Nature of Statistical Learning Theory.
Making large-scale support vector machine learning practical. Advances in Kernel Methods.
Fast training of support vector machines using sequential minimal optimization. Advances in Kernel Methods.
Pairwise classification and support vector machines. Advances in Kernel Methods.
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods.
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond.
Duality and geometry in SVM classifiers. ICML '00: Proceedings of the Seventeenth International Conference on Machine Learning.
Training support vector machines: an application to face detection. CVPR '97: Proceedings of the 1997 Conference on Computer Vision and Pattern Recognition.
On the algorithmic implementation of multiclass kernel-based vector machines. Journal of Machine Learning Research.
Improvements to Platt's SMO algorithm for SVM classifier design. Neural Computation.
Working set selection using second order information for training support vector machines. Journal of Machine Learning Research.
A fast iterative nearest point algorithm for support vector machine classifier design. IEEE Transactions on Neural Networks.
A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks.
Classification by evolutionary generalised radial basis functions. International Journal of Hybrid Intelligent Systems - Advances in Intelligent Agent Systems.
A sequential minimal optimization algorithm for the all-distances support vector machine. CIARP '10: Proceedings of the 15th Iberoamerican Congress on Pattern Recognition, Image Analysis, Computer Vision, and Applications.
Reduced universal background model for speech recognition and identification system. MCPR '12: Proceedings of the 4th Mexican Conference on Pattern Recognition.
The margin maximization principle implemented by binary Support Vector Machines (SVMs) has been shown to be equivalent to finding the hyperplane equidistant from the closest points of the convex hulls that enclose each class of examples. In this paper, we propose an extension of SVMs to multicategory classification that generalizes this geometric formulation. The resulting method preserves the form and complexity of the binary case, optimizing a single convex quadratic program in which each new class introduces only one additional constraint. Reduced convex hulls and non-linear kernels, used in the binary case to handle non-linearly separable data, can also be implemented in our algorithm to obtain additional flexibility. Experimental results on well-known datasets are presented, comparing our method with two widely used multicategory SVM extensions.
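The binary geometric formulation the abstract builds on (the maximum-margin hyperplane bisects the segment joining the closest points of the two class convex hulls) can be illustrated with a small sketch. This is not the paper's multicategory algorithm: it is a generic Gilbert/Frank-Wolfe-style minimum-norm-point iteration on the Minkowski difference of two toy point sets, and the function name and data below are illustrative assumptions.

```python
# Illustrative sketch (not the paper's method): find the closest points of
# conv(A) and conv(B) via a Frank-Wolfe iteration on the difference polytope
# conv(A) - conv(B); the SVM hyperplane is then the bisector of segment [u, v].

def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

def closest_hull_points(A, B, iters=2000):
    # For toy-sized data, the vertices of conv(A) - conv(B) are among all
    # pairwise differences a - b.
    diffs = [(i, j, tuple(x - y for x, y in zip(a, b)))
             for i, a in enumerate(A) for j, b in enumerate(B)]
    weights = [0.0] * len(diffs)   # convex weights over the difference pairs
    weights[0] = 1.0
    w = list(diffs[0][2])          # current point in conv(A) - conv(B)
    for _ in range(iters):
        # Frank-Wolfe step: vertex minimizing the linearized objective <w, z>.
        k = min(range(len(diffs)), key=lambda k: dot(w, diffs[k][2]))
        d = [zi - wi for zi, wi in zip(diffs[k][2], w)]
        dd = dot(d, d)
        if dd == 0.0:
            break
        # Exact line search for the minimum-norm point on segment [w, z_k].
        t = max(0.0, min(1.0, -dot(w, d) / dd))
        if t == 0.0:               # optimality: no descent along this vertex
            break
        w = [wi + t * di for wi, di in zip(w, d)]
        weights = [(1.0 - t) * wm for wm in weights]
        weights[k] += t
    # Recover u in conv(A) and v in conv(B) from the pair weights (u - v = w).
    dim = len(A[0])
    u, v = [0.0] * dim, [0.0] * dim
    for lam, (i, j, _) in zip(weights, diffs):
        for c in range(dim):
            u[c] += lam * A[i][c]
            v[c] += lam * B[j][c]
    return u, v, w

# Toy linearly separable example: two triangles in the plane.
A = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
B = [(2.0, 2.0), (3.0, 2.0), (2.0, 3.0)]
u, v, w = closest_hull_points(A, B)   # u near (0.5, 0.5), v near (2.0, 2.0)
```

The normal w = u - v and the midpoint of [u, v] define the separating hyperplane; the reduced-convex-hull variant mentioned in the abstract would additionally cap each convex weight to handle overlapping classes.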