In this paper, we propose a method to speed up the test phase of SVMs based on the Feature Vector Selection (FVS) method. The support vectors (SVs) appearing in the decision function of an SVM are replaced with feature vectors (FVs) selected from the SVs by FVS. Since the FV set is a subset of the SV set, it is normally smaller, so the decision process of the SVM is sped up. Experiments on 12 IDA benchmark datasets show that the proposed method reduces the number of SVs by 20% to 99% with only a slight increase in the error rate of the SVM. The trade-off between the generalization ability of the resulting SVM and the speedup achieved can be easily controlled by a single parameter.
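The idea described above — replacing the full SV expansion with a smaller set of FVs whose coefficients are refit so the reduced decision function approximates the original — can be sketched as follows. This is a simplified illustration, not the paper's exact FVS algorithm: here the subset is chosen greedily to minimize least-squares error in approximating the original decision values, and all names (`rbf`, `reduce_svs`, `gamma`, `n_keep`) are assumptions for the sketch.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    # Pairwise RBF kernel matrix between the rows of X and the rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def reduce_svs(svs, alpha, n_keep, gamma=1.0):
    """Greedily pick n_keep feature vectors from the SVs, then refit their
    coefficients by least squares so that K[:, F] @ beta approximates the
    original decision values K @ alpha at the SV locations."""
    K = rbf(svs, svs, gamma)      # kernel matrix among all SVs
    target = K @ alpha            # original decision values at the SVs
    chosen, remaining = [], list(range(len(svs)))
    for _ in range(n_keep):
        best, best_err = None, np.inf
        for j in remaining:
            cols = chosen + [j]
            beta, *_ = np.linalg.lstsq(K[:, cols], target, rcond=None)
            err = np.linalg.norm(K[:, cols] @ beta - target)
            if err < best_err:
                best, best_err = j, err
        chosen.append(best)
        remaining.remove(best)
    beta, *_ = np.linalg.lstsq(K[:, chosen], target, rcond=None)
    return svs[chosen], beta
```

With `n_keep` equal to the full SV count, the refit recovers the original coefficients (up to ordering), so the reduced decision function matches the original exactly; smaller values of `n_keep` trade accuracy for a cheaper test phase, playing the role of the control parameter mentioned in the abstract.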