Complexity of data with respect to a particular class of neural networks is studied. Data complexity is measured by the magnitude of a norm of either the regression function induced by the probability measure describing the data or of a function interpolating a sample of input/output training pairs drawn from that measure. The norm is tailored to the type of computational unit used in the network class. It is shown that for data for which this norm is "small", the infima of error functionals over networks with an increasing number of hidden units converge to the global minima relatively quickly. Thus, for such data, networks of moderate model complexity can achieve good performance during learning. For perceptron networks, the relationship between data complexity, data dimensionality, and smoothness is investigated.
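To make the flavor of these results concrete, here is a minimal sketch of the kind of rate the abstract alludes to, stated in the standard Maurey–Jones–Barron approximation framework; the notation (G, span_n G, \|\cdot\|_G, \mathcal{E}, f_\rho, s_G) is illustrative and not taken verbatim from the paper. Let G denote the set of functions computable by a single hidden unit, \operatorname{span}_n G the networks with at most n hidden units, \|\cdot\|_G the variation norm with respect to G, f_\rho the regression function of the data-generating measure, and \mathcal{E} the expected square error, so that \mathcal{E}(f) - \mathcal{E}(f_\rho) = \|f - f_\rho\|_{L^2}^2. If \|f_\rho\|_G is finite, the Maurey–Jones–Barron bound yields

\[
\inf_{f \in \operatorname{span}_n G} \mathcal{E}(f) \;-\; \mathcal{E}(f_\rho)
\;\le\; \frac{s_G^2 \, \|f_\rho\|_G^2}{n},
\qquad
s_G = \sup_{g \in G} \|g\|_{L^2},
\]

so when the variation norm of the regression function is small ("low data complexity" in the sense above), the infima converge to the global minimum at rate O(1/n) with a small constant, and networks with relatively few hidden units already suffice.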