Recursive reduced least squares support vector regression
Pattern Recognition
Efficient Optimization of the Parameters of LS-SVM for Regression versus Cross-Validation Error
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part II
Optimized fixed-size kernel models for large data sets
Computational Statistics & Data Analysis
In this paper, a fast leave-one-out (LOO) evaluation formula is introduced for least squares support vector machine (LS-SVM) classifiers. Compared with the standard LOO procedure, the computation cost is reduced to approximately 1/N, where N is the number of training samples. Owing to this speed-up, the formula can replace the original Level 3 posterior probability approximation of the Bayesian framework [Bayesian framework for least squares support vector machine classifiers, Gaussian processes and kernel Fisher discriminant analysis] for LS-SVM classifiers. The improved inference framework achieves both higher generalization performance and faster computation.
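To illustrate the kind of shortcut the abstract describes, the sketch below uses the well-known closed-form LOO identity for LS-SVM-type models (in the style of Cawley and Talbot): with the dual system matrix M = [[K + I/γ, 1], [1ᵀ, 0]] and solution [α; b] = M⁻¹[y; 0], the LOO residual for sample i equals α_i / (M⁻¹)_{ii}, so one factorization replaces N retrainings. This is a minimal illustration of the general technique, not necessarily the exact formula of this paper; the kernel width, γ, and data are arbitrary choices for the demo.

```python
import numpy as np

def rbf_kernel(X, Z, gamma_k=1.0):
    # Gaussian (RBF) kernel matrix from squared Euclidean distances.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma_k * d2)

def lssvm_system(K, y, gamma=10.0):
    # LS-SVM dual system (regression form):
    #   [ K + I/gamma   1 ] [alpha]   [y]
    #   [     1^T       0 ] [  b  ] = [0]
    n = len(y)
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = K + np.eye(n) / gamma
    M[:n, n] = 1.0
    M[n, :n] = 1.0
    rhs = np.append(y, 0.0)
    return M, rhs

def fast_loo_residuals(M, rhs):
    # Closed-form LOO: residual_i = alpha_i / (M^{-1})_{ii},
    # computed from a single inverse of the full system matrix.
    Minv = np.linalg.inv(M)
    sol = Minv @ rhs
    n = len(rhs) - 1
    return sol[:n] / np.diag(Minv)[:n]

def brute_force_loo_residuals(X, y, gamma=10.0, gamma_k=1.0):
    # Naive LOO: retrain N times, leaving one sample out each time.
    n = len(y)
    res = np.empty(n)
    for i in range(n):
        keep = np.arange(n) != i
        Ktr = rbf_kernel(X[keep], X[keep], gamma_k)
        M, rhs = lssvm_system(Ktr, y[keep], gamma)
        sol = np.linalg.solve(M, rhs)
        alpha, b = sol[:-1], sol[-1]
        k_i = rbf_kernel(X[i:i + 1], X[keep], gamma_k).ravel()
        res[i] = y[i] - (k_i @ alpha + b)
    return res

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=30))

K = rbf_kernel(X, X)
M, rhs = lssvm_system(K, y)
fast = fast_loo_residuals(M, rhs)
slow = brute_force_loo_residuals(X, y)
print(np.allclose(fast, slow))  # both routes give the same LOO residuals
```

The fast route costs one O(N³) factorization instead of N of them, which is the source of the roughly 1/N cost reduction mentioned above.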