The nature of statistical learning theory
Robust Solutions to Least-Squares Problems with Uncertain Data
SIAM Journal on Matrix Analysis and Applications
Mathematics of Operations Research
Support vector machines: hype or hallelujah?
ACM SIGKDD Explorations Newsletter - Special issue on “Scalable data mining algorithms”
A Tutorial on Support Vector Machines for Pattern Recognition
Data Mining and Knowledge Discovery
A Probabilistic Classification System for Predicting the Cellular Localization Sites of Proteins
Proceedings of the Fourth International Conference on Intelligent Systems for Molecular Biology
Robust solutions of uncertain linear programs
Operations Research Letters
Quadratic programming formulations for classification and regression
Optimization Methods & Software - THE JOINT EUROPT-OMS CONFERENCE ON OPTIMIZATION, 4-7 JULY, 2007, PRAGUE, CZECH REPUBLIC, PART II
Robustness and Regularization of Support Vector Machines
The Journal of Machine Learning Research
Support vector machine classification of uncertain and imbalanced data using robust optimization
Proceedings of the 15th WSEAS international conference on Computers
Review: Supervised classification and mathematical optimization
Computers and Operations Research
Robust novelty detection in the framework of a contamination neighbourhood
International Journal of Intelligent Information and Database Systems
In this paper, we investigate theoretical and numerical aspects of robust classification using support vector machines (SVMs), providing second-order cone programming and linear programming formulations. SVMs, introduced by Vapnik, are learning algorithms used for either classification or regression; they are grounded in statistical learning theory and exhibit good generalization properties. The resulting learning problems are convex optimization problems well suited to primal-dual interior-point methods. We investigate the training of an SVM in the case where a bounded perturbation is added to the value of an input xᵢ ∈ ℝⁿ. A robust SVM provides a decision function that is immune to such data perturbations. We consider both the case where the training data are linearly separable and the case where they are nonlinearly separable, and provide computational results for real data sets.
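The idea in the abstract can be sketched concretely. For an ℓ₂-bounded perturbation ‖δᵢ‖ ≤ ρ on each input xᵢ, the worst case of the hinge loss over the perturbation ball is max(0, 1 − yᵢ(w·xᵢ + b) + ρ‖w‖), so a classifier trained on this robust loss remains correct under any such perturbation. The following is a minimal subgradient-descent sketch of that robust objective on a hypothetical toy data set; it is an illustration of the robust-counterpart idea, not the paper's second-order cone or linear programming formulation, and all names and parameters here are assumptions.

```python
import numpy as np

def train_robust_svm(X, y, rho=0.1, lam=0.01, lr=0.01, epochs=200):
    """Subgradient descent on the robust hinge loss
        max(0, 1 - y_i (w . x_i + b) + rho * ||w||) + lam * ||w||^2,
    which is the worst case of the hinge loss when each x_i can be
    perturbed by any delta with ||delta|| <= rho (l2 ball).
    Illustrative sketch only; the paper uses SOCP/LP formulations."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        gw = 2.0 * lam * w          # gradient of the regularizer
        gb = 0.0
        nw = np.linalg.norm(w)
        for xi, yi in zip(X, y):
            margin = yi * (xi @ w + b)
            if 1.0 - margin + rho * nw > 0:   # robust hinge is active
                # subgradient of the active robust hinge term
                gw += -yi * xi + (rho * w / nw if nw > 0 else 0.0)
                gb += -yi
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

# Hypothetical linearly separable toy data.
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = train_robust_svm(X, y)
preds = np.sign(X @ w + b)
```

The ρ‖w‖ term is what distinguishes the robust problem from a standard SVM: it enlarges the required margin in proportion to the perturbation radius, which is exactly why the decision function stays correct when the inputs are shifted within the ball.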