The complexity of neural networks, measured by the number of nodes required to attain a given degree of approximation, has been widely analyzed in the literature. Over the last decades it has been proved that neural networks can overcome the curse of dimensionality under certain conditions. The work surveyed in this paper suggests a very different way to address this problem. Functions bandlimited in frequency are analyzed to overcome the adverse effect of the ''curse of dimensionality'' using a method based on Fourier analysis and uniform multi-dimensional sampling. Sufficiently smooth functions can be expanded in a Gaussian series that converges uniformly to the objective function. The fast decay of the Gaussian functions allows one to omit the terms of the infinite Gaussian series corresponding to samples outside an n-ball of finite radius surrounding the input vector, at the cost of a truncation error in the approximation. Bounds on this truncation error are derived using bounds on the envelope of the coefficients in the series. The most interesting result of this work is that functions bandlimited in frequency are not only free of the ''curse of dimensionality''; the number of variables can moreover be turned from a handicap into an advantage, improving the approximation rates.
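The truncation idea described above can be sketched in code. The snippet below is a minimal one-dimensional illustration, not the paper's construction: it fits Gaussian-series coefficients by solving a linear interpolation system (the survey derives the coefficients analytically from Fourier analysis), and the grid, the Gaussian width, and the truncation radius are all assumed values. It then compares the full series against the series truncated to centers within a finite radius of the evaluation point, showing that the fast Gaussian decay makes the omitted tail negligible.

```python
import numpy as np

def fit_gaussian_series(f, centers, sigma):
    """Solve for coefficients c_k so that
    sum_k c_k * exp(-(x - x_k)^2 / (2 sigma^2)) interpolates f at the centers.
    (Illustrative linear-solve fit; the surveyed work derives the
    coefficients analytically.)"""
    diff = centers[:, None] - centers[None, :]
    gram = np.exp(-diff ** 2 / (2.0 * sigma ** 2))  # symmetric positive definite
    return np.linalg.solve(gram, f(centers))

def evaluate_truncated(x, centers, coeffs, sigma, radius):
    """Sum only the Gaussians whose centers lie within `radius` of x;
    the fast Gaussian decay keeps the omitted tail small."""
    mask = np.abs(centers - x) <= radius
    d = x - centers[mask]
    return float(np.sum(coeffs[mask] * np.exp(-d ** 2 / (2.0 * sigma ** 2))))

# np.sinc is bandlimited: its Fourier transform is supported on [-pi, pi].
h = 0.5                                   # uniform sampling step (assumed)
centers = np.arange(-20.0, 20.0 + h, h)   # uniform sample grid
sigma = 0.6                               # Gaussian width (assumed)
coeffs = fit_gaussian_series(np.sinc, centers, sigma)

x0 = centers[40]                          # a sample point (here x0 = 0.0)
full = evaluate_truncated(x0, centers, coeffs, sigma, radius=np.inf)
trunc = evaluate_truncated(x0, centers, coeffs, sigma, radius=5.0)
```

Because the omitted Gaussians are centered more than `radius` away from `x0`, each contributes at most a factor `exp(-radius**2 / (2 * sigma**2))`, so `trunc` agrees with `full` far beyond any practical accuracy requirement; this is the mechanism the surveyed bounds quantify in n dimensions.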