The Nature of Statistical Learning Theory
An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods
Ridge Regression Learning Algorithm in Dual Variables
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Sparse kernel ridge regression using backward deletion
PRICAI'06 Proceedings of the 9th Pacific Rim International Conference on Artificial Intelligence
LIBSVM: A library for support vector machines
ACM Transactions on Intelligent Systems and Technology (TIST)
Incorporating a priori knowledge from detractor points into support vector classification
ICANNGA'11 Proceedings of the 10th International Conference on Adaptive and Natural Computing Algorithms - Volume Part II
Support Vector Regression (SVR) is one of the most widely used sparse kernel machines, inheriting many of the advantages of Support Vector Machines (SVM). However, because the number of support vectors grows rapidly as the number of training samples increases, the sparseness of SVR is sometimes insufficient. In this paper, we propose two methods that reduce the number of SVR support vectors using backward deletion. Experiments show that our methods can dramatically reduce the number of support vectors without sacrificing generalization performance.