An Introduction to Support Vector Machines: And Other Kernel-Based Learning Methods
Support vector machines with different norms: motivation, formulations and results
Pattern Recognition Letters
Multi-Classification by Using Tri-Class SVM
Neural Processing Letters
Geometrical Properties of Nu Support Vector Machines with Different Norms
Neural Computation
Dual unification of bi-class support vector machine formulations
Pattern Recognition
Rapid and brief communication: Unified dual for bi-class SVM approaches
Pattern Recognition
Automatic recognition of frog calls using a multi-stage average spectrum
Computers & Mathematics with Applications
A study on output normalization in multiclass SVMs
Pattern Recognition Letters
In this paper, a generalization of support vector machines is explored in which the input vectors of each class are measured with different ℓp norms. It is proved that the optimization problem for binary classification using the maximal-margin principle with ℓp and ℓq norms depends only on the ℓp norm when 1 ≤ p ≤ q. Furthermore, the selection of a different bias in the classifier function is a consequence of the ℓq norm in this approach. Some comments on the most commonly used SVM approaches are also given as particular cases.
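The maximal-margin principle under an ℓp input norm rests on a standard duality fact: the ℓp distance from a point x to the hyperplane w·x + b = 0 is |w·x + b| / ||w||_q, where q is the Hölder conjugate of p (1/p + 1/q = 1). The sketch below illustrates this; the function name `lp_margin` is hypothetical, not from the paper.

```python
import numpy as np

def lp_margin(w, b, X, p):
    """Distance from each row of X to the hyperplane w.x + b = 0,
    measured in the lp metric. Equals |w.x + b| / ||w||_q, where
    q is the Holder conjugate of p (1/p + 1/q = 1)."""
    if p == 1:
        q = np.inf          # conjugate of l1 is l-infinity
    elif p == np.inf:
        q = 1               # conjugate of l-infinity is l1
    else:
        q = p / (p - 1)     # general Holder conjugate
    dual_norm = np.linalg.norm(w, ord=q)
    return np.abs(X @ w + b) / dual_norm
```

For example, with w = (3, 4), b = 0, and the point (5, 0): the ℓ2 margin is 15 / ||w||₂ = 15/5 = 3, while the ℓ1 margin is 15 / ||w||∞ = 15/4 = 3.75, showing how the choice of input norm changes the geometric margin being maximized.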