A Bayesian Multiple Models Combination Method for Time Series Prediction
Journal of Intelligent and Robotic Systems
To improve the generalization performance of support vector regression (SVR), we propose a novel model combination method for SVR based on the regularization path. First, we construct the initial candidate model set from the regularization path, whose inherent piecewise linearity makes the construction easy and effective. Then, we carefully select the models for combination from the initial set using an improved Occam's Window method together with an input-dependent strategy. Finally, we combine the selected models via Bayesian model averaging. Experimental results on benchmark data sets show that our combination method holds a significant advantage over model selection methods based on generalized cross validation (GCV) and the Bayesian information criterion (BIC). The results also verify that the improved Occam's Window method and the input-dependent strategy enhance the predictive performance of the combination model.
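The three-stage pipeline in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the regularization path is approximated by a grid of `C` values (the piecewise-linear path algorithm is not reproduced), model weights use a BIC-style posterior approximation, and the Occam's Window cutoff of 0.05 is an assumed value.

```python
# Sketch of: (1) candidate SVR models along an approximate regularization
# path, (2) Occam's Window selection via BIC-based weights, (3) Bayesian
# model averaging of the retained models. All specifics are illustrative.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(120, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(120)
X_tr, y_tr, X_te, y_te = X[:90], y[:90], X[90:], y[90:]

# Step 1: candidate models; a grid of C values stands in for the
# exact regularization path used in the paper.
Cs = np.logspace(-2, 2, 9)
models = [SVR(C=c, epsilon=0.05).fit(X_tr, y_tr) for c in Cs]

# Step 2: BIC-style scores and Occam's Window selection.
n = len(y_tr)
bics = []
for m in models:
    rss = np.sum((m.predict(X_tr) - y_tr) ** 2)
    k = len(m.support_)  # number of support vectors as a complexity proxy
    bics.append(n * np.log(rss / n) + k * np.log(n))
bics = np.array(bics)
weights = np.exp(-0.5 * (bics - bics.min()))
keep = weights / weights.max() >= 0.05  # Occam's Window cutoff (assumed)

# Step 3: Bayesian model averaging over the retained models.
w = weights[keep] / weights[keep].sum()
preds = np.stack([m.predict(X_te) for m, kept in zip(models, keep) if kept])
y_bma = w @ preds
print("models kept:", int(keep.sum()), "of", len(models))
```

The input-dependent selection step from the paper (reweighting models per test point) is omitted here; a single global window is applied instead.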