The kernel minimum squared error estimation (KMSE) model can be viewed as a general framework that includes kernel Fisher discriminant analysis (KFDA), the least squares support vector machine (LS-SVM), and kernel ridge regression (KRR) as particular cases. For continuous real-valued output, this paper shows the equivalence of KMSE and LS-SVM. We apply standard methods for computing prediction intervals in nonlinear regression to the KMSE model. Simulation results show that LS-SVM performs better in terms of both the prediction intervals and the mean squared error (MSE). An experiment on a real data set indicates that KMSE compares favorably with other methods.
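To make the setup concrete, the following is a minimal sketch of LS-SVM regression with a crude Gaussian prediction interval. The kernel bandwidth `sigma`, regularization `gamma`, and the toy sine data are illustrative assumptions, and the interval here is a simple residual-variance band rather than the paper's specific nonlinear-regression interval method.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    # LS-SVM regression in dual variables: solve the linear system
    #   [ 0   1^T           ] [ b     ]   [ 0 ]
    #   [ 1   K + I / gamma ] [ alpha ] = [ y ]
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual weights alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    # Prediction is a kernel expansion over the training points.
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Toy 1-D regression problem: noisy sine wave (hypothetical data).
rng = np.random.default_rng(0)
X = np.linspace(0.0, 2.0 * np.pi, 40)[:, None]
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)

b, alpha = lssvm_fit(X, y)
y_hat = lssvm_predict(X, b, alpha, X)

# Naive 95% prediction band from the residual standard deviation;
# a stand-in for the linearization-based intervals studied in the paper.
s = np.sqrt(np.mean((y - y_hat) ** 2))
lower, upper = y_hat - 1.96 * s, y_hat + 1.96 * s
```

On this toy problem the fitted curve tracks the sine closely, and most training targets fall inside the band; the paper's comparison of interval quality and MSE across KMSE-family models follows the same fit-then-interval pattern.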