Spectral theory of self-adjoint operators in Hilbert space
Geometry and topology of continuous best and near best approximations
Journal of Approximation Theory
Best approximation by Heaviside perceptron networks
Neural Networks
Complexity of Gaussian-radial-basis networks approximating smooth functions
Journal of Complexity
An integral upper bound for neural network approximation
Neural Computation
Model Complexity of Neural Networks and Integral Transforms
ICANN '09 Proceedings of the 19th International Conference on Artificial Neural Networks: Part I
Comparison of worst case errors in linear and neural network approximation
IEEE Transactions on Information Theory
On the exponential convergence of matching pursuits in quasi-incoherent dictionaries
IEEE Transactions on Information Theory
Geometric Upper Bounds on Rates of Variable-Basis Approximation
IEEE Transactions on Information Theory
The capabilities of linear and neural-network models are compared in terms of how model complexity must grow as the required accuracy of approximation increases. Upper bounds on worst-case errors in approximation by neural networks are compared with lower bounds on these errors in linear approximation. The bounds are formulated in terms of singular numbers of certain operators induced by computational units and of high-dimensional volumes of the domains of the functions to be approximated.
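The role of singular numbers can be illustrated numerically. A minimal sketch (not taken from the paper; the kernel, domain, and discretization are illustrative assumptions): discretize the integral operator induced by a Gaussian computational unit on [0, 1] and inspect the decay of its singular values, which is the quantity that drives Kolmogorov-width-type lower bounds for linear approximation.

```python
import numpy as np

# Hypothetical illustration: the operator induced by a Gaussian unit
# g(x, y) = exp(-(x - y)^2 / w^2), discretized on a uniform grid over [0, 1].
# The decay of its singular numbers governs how fast linear approximation
# error can decrease with model complexity.

def gaussian_operator_matrix(n=200, width=1.0):
    """Discretize (T f)(x) = ∫_0^1 exp(-(x - y)^2 / width^2) f(y) dy."""
    x = np.linspace(0.0, 1.0, n)
    K = np.exp(-((x[:, None] - x[None, :]) ** 2) / width**2)
    return K / n  # uniform quadrature weight 1/n

def singular_numbers(K):
    """Singular values in nonincreasing order."""
    return np.linalg.svd(K, compute_uv=False)

if __name__ == "__main__":
    s = singular_numbers(gaussian_operator_matrix())
    # The smoother the unit, the faster the singular numbers decay,
    # and the smaller the linear lower bounds become.
    print(s[:5])
```

The rapid decay visible here reflects the smoothness of the Gaussian unit; for less smooth units the singular numbers decay more slowly, and the comparison between linear and neural-network rates changes accordingly.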