We train a wavelet packet multilayer perceptron neural network (WP-MLP) by backpropagation for time series prediction. Weights in the backpropagation algorithm are usually initialized with small random values. If the random initial weights happen to be far from a good solution, or near a poor local optimum, training may take a long time or become trapped in that local optimum. Proper weight initialization places the weights close to a good solution, reducing training time and increasing the likelihood of reaching a good solution. In this paper, we investigate the effect of weight initialization on the WP-MLP using two clustering algorithms. We test the initialization methods on the WP-MLP with the sunspot and Mackey-Glass benchmark time series, and show that with proper weight initialization, better prediction performance can be attained.
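As an illustration of the idea, one common way to initialize hidden-layer weights from data structure is to run a clustering algorithm on the input vectors and use the cluster centers as the initial weight vectors. The sketch below is an assumption for illustration only: it uses a plain k-means (the abstract does not specify which two clustering algorithms are compared), and the helper names `kmeans` and `init_hidden_weights` are hypothetical.

```python
# Hypothetical sketch: cluster-center weight initialization for an MLP
# hidden layer, instead of small random values. Assumes k-means; the
# paper compares two clustering algorithms it does not name here.
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; returns the k cluster centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # assign each sample to its nearest center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def init_hidden_weights(X, n_hidden, seed=0):
    """Initialize hidden-layer weights from cluster centers, placing
    them near structure in the training inputs rather than at random."""
    W = kmeans(X, n_hidden, seed=seed)      # (n_hidden, n_inputs)
    b = np.zeros(n_hidden)                  # biases start at zero
    return W, b

# Toy usage: 200 samples of a 4-lag time-series embedding.
X = np.random.default_rng(1).normal(size=(200, 4))
W, b = init_hidden_weights(X, n_hidden=8)
```

Starting the hidden units at cluster centers means each unit already responds to a distinct region of the input space, which is the intuition behind reduced training time in the abstract.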