Functional approximation by feed-forward networks: a least-squares approach to generalization

  • Authors:
  • A. R. Webb

  • Affiliations:
  • Defence Res. Inst., Great Malvern

  • Venue:
  • IEEE Transactions on Neural Networks
  • Year:
  • 1994


Abstract

This paper considers a least-squares approach to function approximation and generalization. The particular problem addressed is one in which the training data are noiseless and the requirement is to define a mapping that approximates the data and generalizes to situations in which data samples are corrupted by noise in the input variables. For a finite number of training samples, the least-squares approach produces a generalizer that has the form of a radial basis function network. The finite-sample approximation is valid provided that the perturbations due to noise under the expected operating conditions are large compared with the sample spacing in the data space. In the other extreme of small noise perturbations, a particular parametric form must be assumed for the generalizer, and it is shown that better generalization occurs if the error criterion used in training is modified by the addition of a specific regularization term. This is illustrated with an approximator that has a feedforward architecture, applied to the problem of point-source location using the outputs of an array of receivers in the focal plane of a lens.
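To make the large-noise result concrete, the following is a minimal sketch, not the paper's exact construction: assuming Gaussian input noise of scale sigma, a least-squares generalizer built from noiseless training pairs takes the form of a normalized radial basis function combination of the training targets, with each basis weight given by the noise density evaluated at the distance to a training sample. The function name `make_rbf_generalizer` and the parameter `sigma` are illustrative choices, not taken from the paper.

```python
import numpy as np

def make_rbf_generalizer(x_train, y_train, sigma):
    """Sketch of a least-squares generalizer with RBF form (assumes
    Gaussian input noise of scale `sigma` on a finite, noiseless sample)."""
    def predict(x_query):
        # Squared distances between each query point and each training sample.
        d2 = np.sum((x_query[:, None, :] - x_train[None, :, :]) ** 2, axis=-1)
        w = np.exp(-0.5 * d2 / sigma**2)       # Gaussian basis activations
        w /= w.sum(axis=1, keepdims=True)      # normalize across samples
        return w @ y_train                     # weighted average of targets
    return predict

# Hypothetical usage: approximate y = sin(x) from noiseless samples,
# evaluated under an assumed input-noise scale of sigma = 0.2.
x_train = np.linspace(0.0, np.pi, 20)[:, None]
y_train = np.sin(x_train[:, 0])
predict = make_rbf_generalizer(x_train, y_train, sigma=0.2)
print(predict(np.array([[0.5], [1.5]])))
```

Note that this sketch covers only the regime where the noise perturbations are large relative to the sample spacing; in the small-noise regime the abstract instead prescribes a parametric generalizer trained with an added regularization term, which the sketch does not attempt to reproduce.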