The nature of statistical learning theory
Machine Learning
Advances in kernel methods: support vector learning
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
AI Game Programming Wisdom
Advances in Large Margin Classifiers
Support vector machines with different norms: motivation, formulations and results
Pattern Recognition Letters
Duality and Geometry in SVM Classifiers
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
An asymptotic statistical theory of polynomial kernel methods
Neural Computation
Geometry and learning curves of kernel methods with polynomial kernels
Systems and Computers in Japan
Neural Computation
Support vector machines with adaptive Lq penalty
Computational Statistics & Data Analysis
Artificial Intelligence in Medicine
Support vector machines for classification of input vectors with different metrics
Computers & Mathematics with Applications
By employing the L1 or L∞ norm in margin maximization, support vector machines (SVMs) lead to a linear programming problem, which carries a lower computational load than SVMs with the L2 norm. However, how the choice of norm affects the generalization ability of SVMs has so far been examined only through numerical experiments. In this letter, the geometrical meaning of SVMs with the Lp norm is investigated, and the SVM solutions are shown to depend only weakly on p.
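The abstract's point that L1/L∞-norm margins yield a linear program can be illustrated with a minimal sketch (not the paper's own formulation): the hard-margin LP-SVM minimizes ||w||_1 subject to y_i(w·x_i + b) ≥ 1, which an off-the-shelf LP solver handles directly. The toy dataset and variable names below are assumptions for illustration only.

```python
# Hard-margin LP-SVM sketch: minimize ||w||_1  s.t.  y_i (w . x_i + b) >= 1.
# Writing w = u - v with u, v >= 0 makes the objective linear, so the whole
# problem is a linear program (the abstract's point about L1/Linf norms).
# Toy data and solver choice are illustrative assumptions, not from the paper.
import numpy as np
from scipy.optimize import linprog

# Linearly separable toy dataset (hypothetical).
X = np.array([[2.0, 2.0], [3.0, 3.0], [2.0, 3.0],
              [-2.0, -2.0], [-3.0, -3.0], [-2.0, -3.0]])
y = np.array([1.0, 1.0, 1.0, -1.0, -1.0, -1.0])
n, d = X.shape

# Decision variables z = [u (d entries), v (d entries), b];
# objective sum(u) + sum(v) equals ||w||_1 at the optimum.
c = np.concatenate([np.ones(2 * d), [0.0]])

# Constraints y_i((u - v) . x_i + b) >= 1, rewritten in A_ub @ z <= b_ub form.
A_ub = np.hstack([-y[:, None] * X, y[:, None] * X, -y[:, None]])
b_ub = -np.ones(n)

bounds = [(0, None)] * (2 * d) + [(None, None)]  # u, v >= 0; b free
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")

w = res.x[:d] - res.x[d:2 * d]
b = res.x[2 * d]
margins = y * (X @ w + b)  # all >= 1 for a feasible separating hyperplane
```

Swapping the objective for the maximum of |w_j| (the L∞ norm of w, modeled with one extra auxiliary variable) gives the companion LP corresponding to an L1-norm margin; both variants avoid the quadratic program required by the L2 norm.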