Initializing weights of a multilayer perceptron network by using the orthogonal least squares algorithm

  • Authors:
  • Mikko Lehtokangas;Jukka Saarinen;Kimmo Kaski;Pentti Huuhtanen


  • Venue:
  • Neural Computation
  • Year:
  • 1995

Abstract

Training a multilayer perceptron network usually starts by initializing the network weights with small random values; the weights are then adjusted by an iterative gradient descent-based optimization routine called backpropagation training. If the random initial weights happen to be far from a good solution, or near a poor local optimum, training takes a long time because many iteration steps are required; moreover, the network may fail to converge to an adequate solution at all. On the other hand, if the initial weights are close to a good solution, training is much faster and the likelihood of adequate convergence increases. In this paper a new method for initializing the weights is presented, based on the orthogonal least squares algorithm. Simulation results obtained with the proposed initialization method show a considerable improvement in training compared to randomly initialized networks. In practical experiments the proposed method has proven to be fast and useful for initializing the network weights.
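The abstract does not spell out the authors' exact procedure, but the general orthogonal least squares idea it refers to can be sketched as follows: draw a pool of random candidate hidden units, use Gram-Schmidt-based forward selection to keep the units whose orthogonalized activations explain the most remaining output variance, and fit the output weights by least squares. This is a minimal illustration of OLS-style selection, not necessarily the paper's formulation; all names and the candidate-pool setup are assumptions.

```python
import numpy as np

def ols_select(H, y, n_select):
    """Orthogonal least squares forward selection over the columns of H.

    At each step, orthogonalize every remaining candidate column against
    the already-chosen basis (Gram-Schmidt) and pick the one whose
    orthogonal component explains the most of the target y.
    """
    n, m = H.shape
    selected, basis = [], []
    for _ in range(n_select):
        best_score, best_j, best_q = -1.0, None, None
        for j in range(m):
            if j in selected:
                continue
            q = H[:, j].astype(float).copy()
            for qk in basis:                      # Gram-Schmidt step
                q -= (qk @ H[:, j]) / (qk @ qk) * qk
            denom = q @ q
            if denom < 1e-12:                     # numerically dependent column
                continue
            score = (q @ y) ** 2 / denom          # variance explained by q
            if score > best_score:
                best_score, best_j, best_q = score, j, q
        if best_j is None:
            break
        selected.append(best_j)
        basis.append(best_q)
    return selected

# Hypothetical usage: initialize a 4-input, 5-hidden-unit tanh network.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))
y = np.tanh(X @ rng.standard_normal(4))          # toy regression targets

n_candidates, n_hidden = 30, 5
W_cand = rng.standard_normal((4, n_candidates))  # pool of random hidden units
H = np.tanh(X @ W_cand)                          # candidate activations
idx = ols_select(H, y, n_hidden)                 # keep the best subset

W_init = W_cand[:, idx]                          # initial hidden-layer weights
v, *_ = np.linalg.lstsq(np.tanh(X @ W_init), y, rcond=None)  # output weights
```

Backpropagation would then start from `W_init` and `v` instead of from purely random values, which is the source of the training speed-up the abstract reports.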