Discrete support vector machines are classification models recently introduced in the context of statistical learning theory. Their distinctive feature is the formulation of mixed integer programming problems aimed at deriving optimal separating hyperplanes with minimum empirical error and maximum generalization capability. This paper proposes a new family of discrete SVMs in which the hyperplane establishes a variable softening of the margin to improve the separation between distinct classes. Theoretical bounds are derived to finely tune the parameters of the optimization problem. Computational tests on benchmark datasets from the bio-life science application domain indicate the effectiveness of the proposed approach, which appears to dominate traditional SVMs in terms of both accuracy and percentage of support vectors.
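To make the mixed integer programming idea concrete, the sketch below builds a generic big-M formulation of a linear classifier that minimizes the count of margin violations (the hard-margin, or 0-1, loss) plus an L1 penalty on the weights. This is a minimal illustration in the spirit of discrete SVMs, not the paper's exact model: the function name discrete_svm_mip, the big-M constant, the L1 regularizer, and the choice of the PuLP/CBC solver are all assumptions, and the paper's variable margin softening is not implemented here.

```python
# Minimal sketch (hypothetical, not the paper's exact model) of a
# discrete-SVM-style MIP: binary variables z_i flag margin violations,
# and a big-M constant relaxes the margin constraint for those points.
import pulp

def discrete_svm_mip(X, y, C=1.0, big_m=100.0):
    """X: list of feature vectors; y: labels in {-1, +1}.
    Minimizes ||w||_1 + C * (number of margin violations)."""
    n, d = len(X), len(X[0])
    prob = pulp.LpProblem("discrete_svm", pulp.LpMinimize)

    w = [pulp.LpVariable(f"w_{j}") for j in range(d)]              # weights
    a = [pulp.LpVariable(f"a_{j}", lowBound=0) for j in range(d)]  # a_j >= |w_j|
    b = pulp.LpVariable("b")                                       # bias
    z = [pulp.LpVariable(f"z_{i}", cat="Binary") for i in range(n)]  # violation flags

    # Objective: L1 regularization plus C times the count of violations.
    prob += pulp.lpSum(a) + C * pulp.lpSum(z)

    # Linearize the absolute values: -a_j <= w_j <= a_j.
    for j in range(d):
        prob += w[j] <= a[j]
        prob += -w[j] <= a[j]

    # Margin constraints, deactivated by big-M whenever z_i = 1.
    for i in range(n):
        prob += y[i] * (pulp.lpSum(w[j] * X[i][j] for j in range(d)) + b) >= 1 - big_m * z[i]

    prob.solve(pulp.PULP_CBC_CMD(msg=False))
    return [v.value() for v in w], b.value(), [v.value() for v in z]

# Toy usage on a linearly separable set: all z_i should come out 0.
X = [[1.0, 2.0], [2.0, 3.0], [-1.0, -1.0], [-2.0, -0.5]]
y = [1, 1, -1, -1]
w, b, z = discrete_svm_mip(X, y)
```

The L1 norm is used here because it keeps the model a mixed integer linear program; a quadratic L2 margin term, as in classical SVMs, would instead yield a MIQP and require a different solver.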