Support Vector Machines (SVMs) with various kernels have become very successful in pattern classification and regression. However, a single kernel rarely yields an optimal data model. Replacing the input space by a kernel-based feature space, in which the linear discrimination problem with margin maximization is solved, is a general method that allows various kernels to be mixed and new types of features to be added. We show here how to generate locally optimized kernels that facilitate multi-resolution analysis and can handle complex data distributions using simpler models than the standard data formulation may provide.
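The kernel-mixing idea described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a "multi-resolution" kernel is formed as a sum of RBF kernels at several widths (the widths and the toy data are illustrative assumptions), and the resulting Gram matrix is passed to a standard SVM with a precomputed kernel, so the linear discrimination with margin maximization is solved in the combined feature space.

```python
# Sketch of mixing kernels at multiple resolutions (illustrative, not the
# paper's exact method). Assumes scikit-learn is available.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

def multi_resolution_kernel(X, Y, gammas=(0.1, 1.0, 10.0)):
    """Sum of RBF kernels at several widths; a sum of valid kernels
    is itself a valid (positive semi-definite) kernel."""
    return sum(rbf_kernel(X, Y, gamma=g) for g in gammas)

rng = np.random.default_rng(0)
# Toy two-class data: two Gaussian blobs in 2-D
X = np.vstack([rng.normal(-1.0, 0.5, (40, 2)),
               rng.normal(+1.0, 0.5, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

# Train a margin-maximizing linear discriminant in the mixed-kernel
# feature space by passing the precomputed Gram matrix to SVC.
K_train = multi_resolution_kernel(X, X)
clf = SVC(kernel="precomputed", C=1.0).fit(K_train, y)

acc = clf.score(K_train, y)  # training accuracy, for brevity
print(f"training accuracy: {acc:.2f}")
```

Because any non-negative combination of kernels is again a kernel, the same pattern extends to mixing different kernel families or, as in the paper's locally optimized setting, to weighting kernels differently in different regions of the data.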