Spatial tessellations: concepts and applications of Voronoi diagrams
The nature of statistical learning theory
Making large-scale support vector machine learning practical. In: Advances in kernel methods
Using analytic QP and sparseness to speed training of support vector machines. In: Proceedings of the 1998 conference on Advances in neural information processing systems II
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Large Scale Kernel Regression via Linear Programming. Machine Learning
Convergence of a Generalized SMO Algorithm for SVM Classifier Design. Machine Learning
Machine Learning by Function Decomposition. In: ICML '97 Proceedings of the Fourteenth International Conference on Machine Learning
Iterative Methods for Sparse Linear Systems
Kernel Methods for Pattern Analysis
Benchmarking a Reduced Multivariate Polynomial Pattern Classifier. IEEE Transactions on Pattern Analysis and Machine Intelligence
Lookahead-based algorithms for anytime induction of decision trees. In: ICML '04 Proceedings of the twenty-first international conference on Machine learning
Estimation of Dependences Based on Empirical Data (Springer Series in Statistics)
Robust support vector machines for classification and computational issues. Optimization Methods & Software (Systems Analysis, Optimization and Data Mining in Biomedicine)
On the convergence of the decomposition method for support vector machines. IEEE Transactions on Neural Networks
Asymptotic convergence of an SMO algorithm without any assumptions. IEEE Transactions on Neural Networks
We reformulate the support vector machine approach to classification and regression using a methodology different from the classical largest-margin paradigm. From this reformulation we derive extremely simple quadratic programming problems that admit general symbolic solutions to the classical problems of geometric classification and regression. We obtain a new class of learning machines that are also robust to small perturbations and to corrupted or missing data in the training sets, provided that the amplitude of the perturbations is approximately known. We also introduce a high-performance framework for very large-scale classification and regression problems based on a Voronoi tessellation of the input space. Our approach has been tested on seven benchmark databases, with a noticeable gain in computational time over standard decomposition techniques such as SVMlight.
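The large-scale framework sketched in the abstract rests on one idea: partition the input space into Voronoi cells and solve a small local problem in each cell instead of one global optimization. The following is a minimal illustrative sketch of that decomposition pattern only; the seed selection, the local least-squares classifier, and all data and parameters below are assumptions made for illustration, not the paper's own QP formulation.

```python
# Illustrative sketch: decompose a classification task over a Voronoi
# tessellation of the input space and fit one simple local model per cell.
# NOTE: the local least-squares classifier stands in for the paper's QP
# solvers; it is an assumption chosen only to keep the example self-contained.
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-D data: two Gaussian blobs with labels -1 / +1.
X = np.vstack([rng.normal(-2.0, 1.0, (200, 2)), rng.normal(2.0, 1.0, (200, 2))])
y = np.concatenate([-np.ones(200), np.ones(200)])

# Voronoi seeds: a handful of training points chosen at random.
seeds = X[rng.choice(len(X), size=4, replace=False)]

def cell_of(points):
    """Index of the nearest seed, i.e. the Voronoi cell each point falls in."""
    d = np.linalg.norm(points[:, None, :] - seeds[None, :, :], axis=2)
    return d.argmin(axis=1)

# Fit a local linear classifier (least squares on [x, 1]) inside each cell,
# so each small subproblem is solved independently of the others.
cells = cell_of(X)
models = {}
for c in range(len(seeds)):
    idx = cells == c
    A = np.hstack([X[idx], np.ones((idx.sum(), 1))])
    models[c] = np.linalg.lstsq(A, y[idx], rcond=None)[0]

def predict(points):
    """Route each query to its Voronoi cell and apply that cell's local model."""
    c = cell_of(points)
    A = np.hstack([points, np.ones((len(points), 1))])
    return np.sign(np.array([A[i] @ models[c[i]] for i in range(len(points))]))

acc = float((predict(X) == y).mean())
```

The design point this illustrates is the one claimed in the abstract: each cell's problem involves only a fraction of the training set, so the per-cell cost stays small even as the total number of points grows.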