ICCS '01 Proceedings of the International Conference on Computational Science-Part II
Book reviews: Application of neural networks to adaptive control of nonlinear systems
Automatica (Journal of IFAC)
Mathematical and Computer Modelling: An International Journal
Approximation properties of local bases assembled from neural network transfer functions
Training of a multilayer perceptron network usually starts by initializing the network weights with small random values; the weights are then adjusted with an iterative gradient-descent optimization routine known as backpropagation training. If the random initial weights happen to be far from a good solution, or near a poor local optimum, training takes a long time because many iteration steps are required; the network may even fail to converge to an adequate solution at all. Conversely, if the initial weights are close to a good solution, training is much faster and the likelihood of adequate convergence increases. In this paper a new method for initializing the weights is presented, based on the orthogonal least squares algorithm. Simulation results obtained with the proposed initialization method show a considerable improvement in training compared with randomly initialized networks. In light of practical experiments, the proposed method has proven to be fast and useful for initializing the network weights.
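The abstract does not give the details of the initialization procedure, but an orthogonal-least-squares weight initializer can be sketched along the lines of classical OLS forward selection: draw a pool of candidate hidden units, greedily pick the units whose (orthogonalized) activation columns reduce the output residual most, then fit the output-layer weights by ordinary least squares. The candidate pool size, tanh activations, and random candidate draws below are illustrative assumptions, not the authors' exact method:

```python
import numpy as np


def ols_init(X, y, n_hidden, n_candidates=50, rng=None):
    """Hypothetical sketch of OLS-based MLP weight initialization.

    Draws a pool of random candidate hidden units, selects n_hidden of
    them by orthogonal least squares forward selection, and fits the
    output weights by linear least squares.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    # Candidate hidden weights and biases drawn at random (assumption).
    W = rng.normal(size=(n_candidates, d))
    b = rng.normal(size=n_candidates)
    H = np.tanh(X @ W.T + b)  # (n, n_candidates) candidate activations

    selected, basis = [], []          # chosen units, orthonormal columns
    residual = y.astype(float).copy()
    for _ in range(n_hidden):
        best, best_gain, best_q = None, -np.inf, None
        for j in range(n_candidates):
            if j in selected:
                continue
            # Orthogonalize candidate column against the chosen basis.
            q = H[:, j].copy()
            for u in basis:
                q -= (u @ q) * u
            norm = np.linalg.norm(q)
            if norm < 1e-10:
                continue
            q /= norm
            # Error-reduction criterion: squared projection of residual.
            gain = (q @ residual) ** 2
            if gain > best_gain:
                best, best_gain, best_q = j, gain, q
        selected.append(best)
        basis.append(best_q)
        residual -= (best_q @ residual) * best_q

    # Output weights by ordinary least squares on the selected units
    # (with a bias column appended).
    Hs = np.column_stack([H[:, selected], np.ones(n)])
    w_out, *_ = np.linalg.lstsq(Hs, y, rcond=None)
    return W[selected], b[selected], w_out
```

The weights returned here would serve as the starting point for backpropagation training, in place of purely random initial values.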