In contrast to linear schemes, nonlinear approximation techniques allow for dimension-independent rates of convergence. Unfortunately, typical algorithms (such as backpropagation) are not only computationally demanding but also unstable in the presence of data noise. While stability can be shown for a weak relaxed greedy algorithm, the resulting method has the drawback that it requires smoothness information about the data that is unavailable in practice. In this work we propose an adaptive greedy algorithm that does not need this information but instead recovers it iteratively from the available data. We show that the generated approximations are always at least as smooth as the original function, and that the algorithm remains stable when applied to noisy data. Finally, the applicability of the algorithm is demonstrated by numerical experiments.
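To illustrate the kind of greedy scheme discussed above, the following is a minimal sketch of a relaxed greedy approximation over a dictionary of sigmoidal ridge functions. It is not the paper's adaptive algorithm: the target function, the sigmoid dictionary, and the two-parameter least-squares relaxation step are all assumptions chosen for the example.

```python
import numpy as np

# Sample grid and an illustrative target function (assumption).
x = np.linspace(0.0, 1.0, 200)
target = np.sin(2 * np.pi * x)

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

# Dictionary of scaled/shifted sigmoids sigma(a*x + b);
# the parameter grid is chosen only for illustration.
params = [(a, b) for a in (-10.0, -5.0, 5.0, 10.0)
                 for b in np.linspace(-10.0, 10.0, 41)]
D = np.stack([sigmoid(a * x + b) for a, b in params])
D /= np.linalg.norm(D, axis=1, keepdims=True)   # normalize the atoms

approx = np.zeros_like(x)
for n in range(30):
    residual = target - approx
    # Greedy selection: atom most correlated with the current residual.
    k = int(np.argmax(np.abs(D @ residual)))
    g = D[k]
    # Relaxed update: jointly re-weight the previous approximant and
    # the new atom by a 2-parameter least-squares fit, so the residual
    # norm can never increase from one step to the next.
    A = np.stack([approx, g], axis=1)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    approx = A @ coef

err = np.linalg.norm(target - approx) / np.linalg.norm(target)
```

Because the old approximant is kept as one of the two re-fitted components, each step is at least as good as a plain matching-pursuit step; the adaptive algorithm of the paper additionally steers this iteration using smoothness information recovered from the data.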