Existence and uniqueness results for neural network approximations

  • Authors:
  • R. C. Williamson; U. Helmke

  • Affiliations:
  • Dept. of Syst. Eng., Australian Nat. Univ., Canberra, ACT; -

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1995

Abstract

Some approximation-theoretic questions concerning a certain class of neural networks are considered. The networks considered are single-input, single-output, single-hidden-layer feedforward neural networks with continuous sigmoidal activation functions, no input weights, but with hidden-layer thresholds and output-layer weights. Specifically, questions of existence and uniqueness of best approximations on a closed interval of the real line under mean-square and uniform approximation error measures are studied. A by-product of this study is a reparametrization of the class of networks considered in terms of rational functions of a single variable. This rational reparametrization is used to apply the theory of Padé approximation to the class of networks considered. In addition, a question related to the number of local minima arising in gradient algorithms for learning is examined.
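
As a rough illustration of the network class described in the abstract, the following minimal Python sketch (not from the paper; the logistic sigmoid, the interval [0, 1], the grid size, and the example target and parameters are all assumptions made here for illustration) parametrizes f(x) = Σ_i c_i σ(x + t_i) with hidden-layer thresholds t_i and output-layer weights c_i, and evaluates a discretized mean-square approximation error against a target function on a closed interval.

```python
import numpy as np

def sigmoid(z):
    # Continuous sigmoidal activation; the logistic function is one admissible choice.
    return 1.0 / (1.0 + np.exp(-z))

def network(x, thresholds, weights):
    # f(x) = sum_i weights[i] * sigmoid(x + thresholds[i])
    # Single input, single output, single hidden layer, no (i.e. unit) input weights;
    # the free parameters are the hidden-layer thresholds and output-layer weights.
    x = np.asarray(x, dtype=float)
    return sigmoid(x[..., None] + np.asarray(thresholds)) @ np.asarray(weights)

def mean_square_error(target, thresholds, weights, a=0.0, b=1.0, n=1000):
    # Discretized mean-square approximation error of the network against a
    # target function on the closed interval [a, b].
    xs = np.linspace(a, b, n)
    residual = target(xs) - network(xs, thresholds, weights)
    return np.mean(residual ** 2)

if __name__ == "__main__":
    # Hypothetical example: a two-node network with arbitrarily chosen parameters.
    err = mean_square_error(np.sin, thresholds=[-0.5, 0.3], weights=[1.2, -0.7])
    print(f"mean-square error on [0, 1]: {err:.6f}")
```

The existence and uniqueness questions studied in the paper concern the best choice of such thresholds and weights (minimizing this kind of error measure), not the gradient procedure one might use to find them.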