Estimates of covering numbers of convex sets with slowly decaying orthogonal subsets
Discrete Applied Mathematics
Bounds on rates of variable-basis and neural-network approximation
IEEE Transactions on Information Theory
On the geometric convergence of neural approximations
IEEE Transactions on Neural Networks
Model complexity of feedforward neural networks is studied in terms of rates of variable-basis approximation. Sets of functions are described for which the errors of approximation by networks with n hidden units converge to zero geometrically fast as n increases. The geometric rate of convergence, however, depends on parameters specific to each function being approximated. The results are illustrated by estimates of these parameters for functions in infinite-dimensional Hilbert spaces.
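The kind of function-specific geometric rate described in the abstract can be sketched numerically. The snippet below is a hypothetical illustration, not the paper's construction: it takes a function in a Hilbert space whose orthonormal-expansion coefficients decay as q^k for some q in (0, 1) (the function-specific parameter), computes the best n-term approximation error in closed form, and checks that consecutive errors shrink by the constant factor q, i.e. geometrically.

```python
import math

def best_n_term_error(q, n):
    """Best n-term approximation error for a function with orthonormal
    coefficients q^k, k = 1, 2, ... (a hypothetical example function).

    The squared error is the tail sum_{k>n} q^{2k} = q^{2(n+1)} / (1 - q^2),
    so the error itself is q^{n+1} / sqrt(1 - q^2): geometric in n.
    """
    return math.sqrt(q ** (2 * (n + 1)) / (1 - q ** 2))

# q plays the role of the function-specific convergence parameter.
q = 0.5
errors = [best_n_term_error(q, n) for n in range(1, 6)]
# Consecutive-error ratios all equal q, confirming geometric decay.
ratios = [errors[i + 1] / errors[i] for i in range(len(errors) - 1)]
```

A function with slower coefficient decay (q closer to 1) still converges geometrically, but at a worse rate, which mirrors the abstract's point that the speed of convergence is not uniform over the whole set of functions.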