The support vector machine (SVM) is one of the most popular and promising classification algorithms. Once a classification rule has been constructed via the SVM, it is essential to evaluate its prediction accuracy. In this paper, we develop procedures for obtaining both point and interval estimators of the prediction error. Under mild regularity conditions, we derive the consistency and asymptotic normality of the prediction error estimators for SVMs with finite-dimensional kernels. A perturbation-resampling procedure is proposed to obtain interval estimates of the prediction error in practice. Based on numerical studies on simulated data and a benchmark repository, we recommend interval estimates centered at the cross-validated point estimates of the prediction error. Further applications of the proposed procedure to model evaluation and feature selection are illustrated with two examples.
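The two estimators described above can be sketched in code. The following is an illustrative sketch only, not the paper's procedure: a simple midpoint-of-means threshold classifier stands in for the SVM, the point estimate is a K-fold cross-validated misclassification rate, and the interval comes from perturbation resampling, i.e., reweighting each observation with an independent Exp(1) weight, refitting, and taking percentiles of the resampled error statistics. All function names and the synthetic data are assumptions for illustration.

```python
import random
import statistics

random.seed(0)

# Synthetic 1-D data: class 0 centered at -1, class 1 centered at +1.
X = [random.gauss(-1, 1) for _ in range(100)] + [random.gauss(1, 1) for _ in range(100)]
y = [0] * 100 + [1] * 100

def fit_threshold(xs, ys, ws=None):
    """Weighted midpoint-of-class-means classifier (illustrative stand-in for an SVM)."""
    if ws is None:
        ws = [1.0] * len(xs)
    s0 = sum(w * x for x, lab, w in zip(xs, ys, ws) if lab == 0)
    w0 = sum(w for lab, w in zip(ys, ws) if lab == 0)
    s1 = sum(w * x for x, lab, w in zip(xs, ys, ws) if lab == 1)
    w1 = sum(w for lab, w in zip(ys, ws) if lab == 1)
    t = 0.5 * (s0 / w0 + s1 / w1)
    return lambda x: int(x > t)

def cv_error(xs, ys, k=5):
    """K-fold cross-validated misclassification rate (point estimate)."""
    idx = list(range(len(xs)))
    random.shuffle(idx)
    fold_errors = []
    for j in range(k):
        test = set(idx[j::k])
        train = [i for i in idx if i not in test]
        clf = fit_threshold([xs[i] for i in train], [ys[i] for i in train])
        fold_errors.append(sum(clf(xs[i]) != ys[i] for i in test) / len(test))
    return statistics.mean(fold_errors)

def perturbation_interval(xs, ys, b=200, alpha=0.05):
    """Perturbation resampling: draw Exp(1) weights, refit on the weighted
    sample, and compute the weighted error; percentiles give the interval."""
    errs = []
    for _ in range(b):
        ws = [random.expovariate(1.0) for _ in xs]
        clf = fit_threshold(xs, ys, ws)
        err = sum(w * (clf(x) != lab) for x, lab, w in zip(xs, ys, ws)) / sum(ws)
        errs.append(err)
    errs.sort()
    return errs[int(b * alpha / 2)], errs[int(b * (1 - alpha / 2)) - 1]

point = cv_error(X, y)
lo, hi = perturbation_interval(X, y)
print(point, lo, hi)
```

In the spirit of the paper's recommendation, the reported interval would be re-centered at the cross-validated point estimate rather than at the resampled mean; here the raw percentile interval is shown for brevity.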