Efficient sparse least squares support vector machines for pattern classification
Computers & Mathematics with Applications
The least-squares support vector machine (LS-SVM) can be obtained by solving a simpler optimization problem than that of the standard support vector machine (SVM): a linear system instead of a quadratic program. Its shortcoming is the loss of sparseness, which usually results in slow testing. Several pruning methods have been proposed to recover sparseness, and we find that they can be further improved for classification problems. In this paper, a different reduced training set is selected to re-train the LS-SVM, and a new procedure is proposed to obtain sparseness. The performance of the proposed method is compared with that of other typical methods, and the results indicate that it is more effective.
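To make the two ideas in the abstract concrete, the sketch below shows (a) LS-SVM training as a single linear system and (b) the classical pruning heuristic that drops the points with the smallest |alpha| and re-trains on the rest. This is an illustrative NumPy sketch of the standard formulations only, not the paper's proposed selection procedure; the RBF kernel, the hyperparameter values, and the 50% pruning fraction are assumptions for the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel from pairwise squared distances.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def lssvm_train(X, y, C=10.0, gamma=1.0):
    # LS-SVM training reduces to one (n+1)x(n+1) linear system:
    #   [[0, 1^T], [1, K + I/C]] [b; alpha] = [0; y]
    # Note every alpha_i is typically nonzero -- the loss of sparseness
    # the abstract refers to.
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, coefficients alpha

def lssvm_predict(X_train, alpha, b, X_test, gamma=1.0):
    # Decision function uses all training points kept in the model.
    return np.sign(rbf_kernel(X_test, X_train, gamma) @ alpha + b)

def lssvm_prune(X, y, frac=0.5, C=10.0, gamma=1.0):
    # Classical pruning (Suykens-style): train once, discard the
    # fraction `frac` of points with the smallest |alpha|, then
    # re-train the LS-SVM on the reduced set.
    _, alpha = lssvm_train(X, y, C, gamma)
    keep = np.argsort(np.abs(alpha))[int(frac * len(y)):]
    b2, alpha2 = lssvm_train(X[keep], y[keep], C, gamma)
    return keep, b2, alpha2
```

Testing cost is what pruning targets: prediction is a sum over the retained points, so halving the reduced set roughly halves per-sample kernel evaluations at test time.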