When is approximation by Gaussian networks necessarily a linear process?

  • Author: H. N. Mhaskar
  • Affiliation: Department of Mathematics, California State University, Los Angeles, CA
  • Venue: Neural Networks
  • Year: 2004

Abstract

Let s ≥ 1 be an integer. A Gaussian network is a function on R^s of the form g(x) = Σ_{k=1}^{N} a_k exp(−||x − x_k||^2). The minimal separation among the centers, defined by (1/2) min_{1≤j≠k≤N} ||x_j − x_k||, is an important characteristic of the network that determines the stability of interpolation by Gaussian networks, the degree of approximation by such networks, etc. Let (within this abstract only) the set of all Gaussian networks with minimal separation exceeding 1/m be denoted by G_m. We prove that for functions f ∈ L^2(R^s) such that ||f||_{L^2(R^s \ [−t,t]^s)} = O(t^{−β}), if the degree of L^2 (nonlinear) approximation of f from G_m is O(m^{−β}), then necessarily the degree of approximation of f by (rectangular) partial sums of degree m^2 of the Hermite expansion of f is also O(m^{−β}). Moreover, Gaussian networks in G_m having fixed centers in a ball of radius O(m), with coefficients that are linear functionals of f, can be constructed to yield the same degree of approximation. Similar results are proved for the L^p norms, 1 ≤ p ≤ ∞, but with the condition that the number of neurons N should satisfy log N = O(m^2).
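The two quantities the abstract defines, the network g(x) = Σ_k a_k exp(−||x − x_k||^2) and the minimal separation (1/2) min_{j≠k} ||x_j − x_k||, can be computed directly. A minimal sketch follows; the function names and the use of NumPy are illustrative assumptions, not part of the paper:

```python
# Illustrative sketch (not from the paper): evaluating a Gaussian network
# and its minimal separation, per the definitions in the abstract.
from itertools import combinations

import numpy as np


def gaussian_network(x, centers, coeffs):
    """Evaluate g(x) = sum_k a_k * exp(-||x - x_k||^2).

    x: point in R^s, shape (s,); centers: shape (N, s); coeffs: shape (N,).
    """
    sq_dists = np.sum((centers - x) ** 2, axis=1)  # ||x - x_k||^2 for each k
    return np.sum(coeffs * np.exp(-sq_dists))


def minimal_separation(centers):
    """Compute (1/2) * min_{j != k} ||x_j - x_k|| over the centers."""
    pairwise = [np.linalg.norm(a - b) for a, b in combinations(centers, 2)]
    return 0.5 * min(pairwise)
```

For example, two centers at (0, 0) and (2, 0) have minimal separation 1, so with m > 1 a network on these centers belongs to the class G_m of the abstract.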