On functional approximation with normalized Gaussian units

  • Authors:
  • Michel Benaim

  • Affiliations:
  • -

  • Venue:
  • Neural Computation
  • Year:
  • 1994

Abstract

Feedforward neural networks with a single hidden layer using normalized Gaussian units are studied. It is proved that such neural networks are capable of universal approximation in a satisfactory sense. Then, a hybrid learning rule as per Moody and Darken that combines unsupervised learning of hidden units and supervised learning of output units is considered. By using the method of ordinary differential equations for adaptive algorithms (the ODE method), it is shown that the asymptotic properties of the learning rule may be studied in terms of an autonomous cascade of dynamical systems. Some recent results from Hirsch about cascades are used to show the asymptotic stability of the learning rule.
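The architecture and learning rule summarized in the abstract can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the competitive (winner-take-all) center update, the LMS weight update, the learning rates, and the single-output, shared-width setting are all assumptions made for the sketch. The normalized Gaussian activations divide each unit's Gaussian response by the sum over all units, so the hidden layer forms a partition of unity.

```python
import numpy as np

def normalized_gaussian_activations(x, centers, sigma):
    """Normalized Gaussian hidden-unit activations.

    Each Gaussian response is divided by the sum over all units,
    so the activations are nonnegative and sum to one.
    """
    # Squared distances from the input x to every center.
    d2 = np.sum((centers - x) ** 2, axis=1)
    g = np.exp(-d2 / (2.0 * sigma ** 2))
    return g / np.sum(g)

def network_output(x, centers, weights, sigma):
    """Single-hidden-layer network: weighted sum of normalized activations."""
    return np.dot(weights, normalized_gaussian_activations(x, centers, sigma))

def hybrid_step(x, y, centers, weights, sigma, lr_c=0.05, lr_w=0.1):
    """One step of a Moody-and-Darken-style hybrid rule (a sketch):
    an unsupervised competitive update of the winning center, then a
    supervised LMS update of the output weights."""
    # Unsupervised part: move the nearest center toward the input.
    k = np.argmin(np.sum((centers - x) ** 2, axis=1))
    centers[k] += lr_c * (x - centers[k])
    # Supervised part: LMS (stochastic gradient) step on output weights.
    a = normalized_gaussian_activations(x, centers, sigma)
    err = y - np.dot(weights, a)
    weights += lr_w * err * a
    return centers, weights
```

As a usage sketch, one can train the network online on samples of a target function, e.g. `sin(2*pi*x)` on `[0, 1]`, by repeatedly drawing `x`, computing `y`, and calling `hybrid_step`; the centers settle under the unsupervised dynamics while the weights track the target, mirroring the cascade structure (center dynamics driving weight dynamics) studied in the paper.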