This paper addresses the prediction of chaotic time series with a proposed multi-scale Gaussian process (MGP) model, an extension of the classical Gaussian process (GP) model. Whereas GP spends considerable time searching for optimal hyperparameters, MGP employs a covariance function constructed from a scaling function together with its different dilations and translations, so that the optimal hyperparameter is easy to determine. Moreover, the scaling function with its different dilations and translations forms a complete set of bases, which allows MGP to achieve better prediction performance than GP. The effectiveness of MGP is evaluated on the simulated Mackey-Glass series as well as a real-world electric load series. Results show that the proposed model outperforms GP in prediction performance and takes far less time to determine its hyperparameter. Results also show that the performance of MGP is competitive with the support vector machine (SVM), and both outperform radial basis function (RBF) networks.
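As a rough illustration of the idea described above (a sketch, not the authors' implementation), the following Python snippet builds a covariance as a weighted sum of squared-exponential kernels at dyadic dilations, standing in for a covariance induced by a Gaussian scaling function at several scales, and plugs it into the standard GP prediction equations. The Gaussian scaling function, the scale set, the weights, and the noise level are all assumptions for illustration.

```python
import numpy as np

def multiscale_kernel(x1, x2, scales=(1.0, 0.5, 0.25), weights=(1.0, 0.5, 0.25)):
    """Assumed multi-scale covariance: a weighted sum of squared-exponential
    kernels, one per dyadic dilation of a Gaussian scaling function."""
    d2 = (x1[:, None] - x2[None, :]) ** 2
    return sum(w * np.exp(-0.5 * d2 / s ** 2) for s, w in zip(scales, weights))

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Standard GP regression posterior mean with the multi-scale covariance."""
    K = multiscale_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = multiscale_kernel(x_test, x_train)
    alpha = np.linalg.solve(K, y_train)   # (K + noise*I)^{-1} y
    return K_star @ alpha                 # posterior predictive mean

# Toy usage on a noisy sine wave (a stand-in for the Mackey-Glass series).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
y = np.sin(x) + 0.05 * rng.standard_normal(x.size)
x_test = np.linspace(0.0, 10.0, 50)
y_pred = gp_predict(x, y, x_test)
```

Because the multi-scale kernel's weights and scales are fixed in advance, the only quantity left to tune is the noise level, which loosely mirrors the paper's claim that MGP's optimal hyperparameter is easy to determine compared with a fully parameterized GP covariance.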