Support vector machines (SVMs) are well known to give good results on many pattern recognition problems, but their classification speed is substantially slower than that of neural networks. SVM classification can be sped up by reducing the complexity of the decision function, that is, by decreasing the number of support vectors. An iterative process is proposed to prune an SVM while avoiding a marked decline in classification accuracy. Computational results indicate that the number of support vectors grows with the size of the training set, and that simplified SVMs with far fewer support vectors achieve classification accuracy nearly equal to that of the original SVM. The proposed method is also compared with previous work; the results support it as an effective way to obtain a simplified SVM for large problems.
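The speed-up described above rests on the fact that an SVM's decision function is a sum over support vectors, so dropping support vectors directly shortens classification time. The sketch below illustrates this idea with scikit-learn: it keeps only the support vectors with the largest dual-coefficient magnitudes and rebuilds the RBF decision function by hand. This simple magnitude-based pruning is an assumption for illustration, not the paper's iterative pruning procedure; the function name `pruned_predict` and the `keep_frac` parameter are likewise hypothetical.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Toy binary classification problem.
X, y = make_classification(n_samples=600, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

clf = SVC(kernel="rbf", gamma="scale", C=1.0).fit(X_tr, y_tr)

# gamma="scale" in scikit-learn means 1 / (n_features * Var(X_train)).
gamma = 1.0 / (X_tr.shape[1] * X_tr.var())

def pruned_predict(clf, X, keep_frac=0.5):
    """Predict using only the fraction of support vectors with the
    largest |dual coefficient| (alpha_i * y_i); the rest are pruned.
    Illustrative sketch, not the paper's exact simplification method."""
    sv = clf.support_vectors_
    dc = clf.dual_coef_.ravel()
    k = max(1, int(keep_frac * len(dc)))
    idx = np.argsort(-np.abs(dc))[:k]
    # RBF kernel K(x, sv_i) = exp(-gamma * ||x - sv_i||^2)
    d2 = ((X[:, None, :] - sv[idx][None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-gamma * d2)
    scores = K @ dc[idx] + clf.intercept_
    # For classes_ == [0, 1], a positive score maps to class 1.
    return (scores > 0).astype(int)

full_acc = clf.score(X_te, y_te)
pruned_acc = np.mean(pruned_predict(clf, X_te, keep_frac=0.5) == y_te)
print(f"full SVM: {full_acc:.3f}, half the support vectors: {pruned_acc:.3f}")
```

With `keep_frac=1.0` the manual decision function reproduces the fitted model's predictions, which confirms the kernel expansion is assembled correctly; shrinking `keep_frac` then trades a little accuracy for a proportionally cheaper decision function, mirroring the trade-off the abstract describes.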