The weighted least-squares support vector machine (WLS-SVM) is an improved version of the least-squares support vector machine (LS-SVM): it assigns weights to the error variables to correct the biased estimation of LS-SVM. The traditional weight-setting algorithm for WLS-SVM depends on the results of an unweighted LS-SVM and requires retraining the WLS-SVM. In this paper, a heuristic weight-setting method is proposed. The method derives from the idea of outlier mining and is independent of the unweighted LS-SVM. More importantly, a fast iterative updating algorithm is presented, which reaches the final WLS-SVM solution in a few updating steps instead of retraining WLS-SVM from scratch. Experiments on simulated instances and real-world datasets demonstrate that the proposed WLS-SVM achieves comparable accuracy and that the fast iterative updating algorithm performs encouragingly.
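To make the setting concrete, the following is a minimal sketch of the standard (W)LS-SVM regression pipeline that the paper builds on: solve the LS-SVM dual linear system, derive weights from the residuals of the unweighted fit, and retrain with those weights. The RBF kernel, the hyperparameter values, and the piecewise-linear weighting scheme (with cutoffs `c1`, `c2` and a robust IQR-based scale estimate) are illustrative assumptions in the spirit of the classical weighted LS-SVM literature, not the heuristic weight-setting or fast updating algorithm proposed in this paper.

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between two 1-D sample arrays."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0, weights=None):
    """Solve the (weighted) LS-SVM dual system:
        [0        1^T              ] [b    ]   [0]
        [1   K + diag(1/(gamma*v)) ] [alpha] = [y]
    With weights=None this is the unweighted LS-SVM (all v_i = 1)."""
    n = len(X)
    v = np.ones(n) if weights is None else weights
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gamma * v))
    rhs = np.concatenate([[0.0], y])
    sol = np.linalg.solve(A, rhs)
    return sol[1:], sol[0]          # alpha, b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Evaluate f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def residual_weights(e, c1=2.5, c2=3.0):
    """Illustrative weighting: score residuals against a robust
    (IQR-based) scale estimate and downweight likely outliers."""
    s = (np.percentile(e, 75) - np.percentile(e, 25)) / 1.349
    r = np.abs(e / s)
    return np.where(r <= c1, 1.0,
                    np.where(r <= c2, (c2 - r) / (c2 - c1), 1e-4))

# Usage: fit a noisy sine with one injected outlier, then reweight.
X = np.linspace(-3, 3, 30)
y = np.sin(X)
y[10] += 5.0                        # injected outlier
alpha0, b0 = lssvm_fit(X, y)        # unweighted LS-SVM
e0 = y - lssvm_predict(X, alpha0, b0, X)
v = residual_weights(e0)            # weights from residuals
alpha1, b1 = lssvm_fit(X, y, weights=v)   # weighted retrain
```

Note that this classical scheme exhibits exactly the drawback the paper targets: the weights are computed from the unweighted solution, and a full retrain is needed once they are set.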