Let s ≥ 1 be an integer. A Gaussian network is a function on ℝ^s of the form g(x) = Σ_{k=1}^{N} a_k exp(−‖x − x_k‖²). The minimal separation among the centers, defined by (1/2) min_{1≤j≠k≤N} ‖x_j − x_k‖, is an important characteristic of the network: it governs the stability of interpolation by Gaussian networks, the degree of approximation by such networks, etc. Let (within this abstract only) G_m denote the set of all Gaussian networks with minimal separation exceeding 1/m. We prove that for functions f ∈ L²(ℝ^s) such that ‖f‖_{L²(ℝ^s ∖ [−t,t]^s)} = O(t^{−β}), if the degree of L² (nonlinear) approximation of f from G_m is O(m^{−β}), then necessarily the degree of approximation of f by (rectangular) partial sums of degree m² of the Hermite expansion of f is also O(m^{−β}). Moreover, Gaussian networks in G_m having fixed centers in a ball of radius O(m), with coefficients given by linear functionals of f, can be constructed to yield the same degree of approximation. Similar results are proved for the L^p norms, 1 ≤ p ≤ ∞, but under the condition that the number of neurons N satisfy log N = O(m²).
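To make the two definitions concrete, here is a minimal NumPy sketch (the function names are ours, not the paper's) that evaluates a Gaussian network g(x) = Σ_k a_k exp(−‖x − x_k‖²) and computes the minimal separation (1/2) min_{j≠k} ‖x_j − x_k‖ of its centers:

```python
import numpy as np

def gaussian_network(x, centers, coeffs):
    """Evaluate g(x) = sum_k a_k * exp(-||x - x_k||^2) at a point x in R^s."""
    diffs = centers - x                  # shape (N, s)
    sq_dists = np.sum(diffs**2, axis=1)  # ||x - x_k||^2 for each center
    return np.dot(coeffs, np.exp(-sq_dists))

def minimal_separation(centers):
    """Return (1/2) * min_{j != k} ||x_j - x_k|| over the centers."""
    d = centers[:, None, :] - centers[None, :, :]   # pairwise differences
    dists = np.sqrt(np.sum(d**2, axis=-1))          # pairwise distances
    np.fill_diagonal(dists, np.inf)                 # ignore j == k
    return 0.5 * dists.min()

# Illustrative example: N = 3 centers in R^2 (s = 2)
centers = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0]])
coeffs = np.array([1.0, -0.5, 2.0])
value = gaussian_network(np.zeros(2), centers, coeffs)
sep = minimal_separation(centers)
```

A network belongs to G_m in the paper's notation when `minimal_separation(centers) > 1/m`.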