ScaleNet-multiscale neural-network architecture for time series prediction

  • Authors: A. B. Geva
  • Affiliation: Dept. of Electr. & Comput. Eng., Ben-Gurion Univ. of the Negev, Beer-Sheva
  • Venue: IEEE Transactions on Neural Networks
  • Year: 1998


Abstract

The effectiveness of a multiscale neural-network architecture for time-series prediction of nonlinear dynamic systems is investigated. The prediction task is simplified by decomposing past windows of different lengths into wavelet coefficients at different scales and predicting the coefficients of each scale with a separate multilayer perceptron. The short-term history is decomposed into the lower scales of wavelet coefficients, which are used for detailed analysis and prediction, while the long-term history is decomposed into the higher scales, which are used for the analysis and prediction of slow trends in the time series. These coordinated scales of time and frequency provide an interpretation of the series' structure and convey more information about its history using fewer coefficients than other methods. The scale-level results are combined by an expert perceptron, which learns the weight of each scale in the goal prediction of the original time series. Each network is trained by backpropagation. The weights and biases are initialized by a clustering algorithm applied to the temporal patterns of the time series, which improves the prediction results compared with random initialization. The suggested multiscale architecture outperforms the corresponding single-scale architectures, and improved learning methods for each of the ScaleNet networks could further improve the prediction results.
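The decomposition-and-combine pipeline described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: it assumes a Haar wavelet, replaces each per-scale multilayer perceptron with a trivial placeholder predictor, and uses fixed stand-in weights where the paper trains an expert perceptron by backpropagation.

```python
import numpy as np

def haar_decompose(window, levels):
    """Split a past window into Haar wavelet coefficients at several scales.

    Lower scales (first entries of `details`) hold fine, short-term detail;
    the final approximation captures the slow long-term trend.
    """
    approx = np.asarray(window, dtype=float)
    details = []
    for _ in range(levels):
        pairs = approx.reshape(-1, 2)
        details.append((pairs[:, 0] - pairs[:, 1]) / np.sqrt(2))  # detail coefficients
        approx = (pairs[:, 0] + pairs[:, 1]) / np.sqrt(2)         # smoothed signal
    return details, approx

def combine_scale_predictions(details, approx, weights):
    """Mix one placeholder prediction per scale with expert weights.

    Here each "per-scale predictor" just echoes its last coefficient;
    in ScaleNet each scale is predicted by its own MLP.
    """
    preds = np.array([d[-1] for d in details] + [approx[-1]])
    return float(np.dot(weights, preds))

# A 16-sample window: a sinusoid riding on a slow trend.
window = np.sin(np.linspace(0, 4 * np.pi, 16)) + 0.1 * np.arange(16)
details, approx = haar_decompose(window, levels=3)
print([len(d) for d in details], len(approx))  # 16 samples -> details 8, 4, 2; approximation 2

weights = np.full(4, 0.25)  # stand-in for the learned expert weights
prediction = combine_scale_predictions(details, approx, weights)
```

Because the Haar transform is orthonormal, the coefficients at all scales together carry exactly the energy of the original window, which is why the short- and long-term history can be represented with few coefficients per scale.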