Consistency of multilayer perceptron regression estimators

  • Authors:
Jan Mielniczuk; Joanna Tyrcha

  • Affiliations:
Institute of Computer Science, Poland; Institute of Computer Science, Poland

  • Venue:
  • Neural Networks
  • Year:
  • 1993

Abstract

In the paper a three-layer perceptron with one hidden layer and an output layer consisting of a single neuron is considered. This is a commonly used architecture for regression problems, where a perceptron minimizing the mean squared error criterion for the data points (x_k, y_k), k = 1, ..., N is sought. It is shown that in the model y_k = g_0(x_k) + ε_k, k = 1, ..., N, where x_k is independent of the zero-mean error term ε_k, this procedure is consistent as N → ∞, provided that g_0 is representable as a three-layer perceptron with the Heaviside transfer function. The same result holds when the transfer function is an arbitrary continuous function with bounded limits at ±∞ and the hidden-to-output weights in the considered family of perceptrons are bounded.
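The setting described in the abstract can be illustrated with a minimal sketch: a one-hidden-layer perceptron with a single linear output neuron, fitted by batch gradient descent on the mean squared error for samples y_k = g_0(x_k) + ε_k. This is not the paper's construction or proof; the sigmoid transfer function (continuous, with bounded limits at ±∞), the network width, learning rate, and the example target g_0 are all illustrative choices.

```python
import numpy as np

def sigmoid(z):
    # Continuous transfer function with bounded limits at +/- infinity.
    return 1.0 / (1.0 + np.exp(-z))

def fit_mlp(x, y, hidden=8, lr=0.3, epochs=8000, seed=0):
    """Fit a one-hidden-layer perceptron with a single linear output
    neuron by batch gradient descent on the mean squared error
    (1/N) * sum_k (y_k - f(x_k))^2.  Returns the fitted function."""
    rng = np.random.default_rng(seed)
    N = x.shape[0]
    W1 = rng.normal(scale=1.0, size=(1, hidden))  # input-to-hidden weights
    b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden)       # hidden-to-output weights
    b2 = 0.0
    X = x.reshape(-1, 1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                  # hidden activations, (N, hidden)
        pred = H @ w2 + b2                        # single output neuron
        err = pred - y                            # residuals
        # Gradients of the MSE criterion with respect to each parameter.
        gw2 = H.T @ err / N
        gb2 = err.mean()
        gH = np.outer(err, w2) * H * (1.0 - H)    # backprop through sigmoid
        gW1 = X.T @ gH / N
        gb1 = gH.mean(axis=0)
        w2 -= lr * gw2
        b2 -= lr * gb2
        W1 -= lr * gW1
        b1 -= lr * gb1
    return lambda t: sigmoid(t.reshape(-1, 1) @ W1 + b1) @ w2 + b2

# Demo with an illustrative regression function g_0 and zero-mean noise.
rng = np.random.default_rng(1)
x = rng.uniform(-2.0, 2.0, size=400)
g0 = lambda t: sigmoid(3.0 * t - 1.0)             # hypothetical true regression
y = g0(x) + rng.normal(scale=0.05, size=x.size)   # zero-mean error term
f = fit_mlp(x, y)
mse = np.mean((f(x) - y) ** 2)                    # training MSE after fitting
grid = np.linspace(-2.0, 2.0, 50)
max_err = np.max(np.abs(f(grid) - g0(grid)))      # deviation from g_0 on a grid
```

The consistency result concerns the behavior of such a least-squares fit as the sample size N grows; the sketch shows only the finite-sample estimation procedure being analyzed, not the asymptotic argument.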