Estimation of a regression function is considered for data consisting of an independent and identically distributed sample of the underlying distribution, observed with additional measurement errors in the independent variables. The measurement errors are allowed to be dependent and to have nonzero mean. It is shown that suitably defined least squares neural network estimates applied to such data achieve a rate of convergence similar to that of least squares neural network estimates applied to an error-free independent and identically distributed sample of the underlying distribution, provided the measurement errors are small.
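The following is only a minimal sketch of the setting described above, not the paper's estimator: the paper's "suitably defined" least squares neural network estimate may involve constraints and parameter choices not reproduced here. The target function, the dependent and biased measurement-error model, the use of scikit-learn's MLPRegressor as a stand-in single-hidden-layer least squares fit, and all numerical parameters are illustrative assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Underlying regression model: Y = m(X) + noise (m is an assumed toy target).
n = 500
X = rng.uniform(-2.0, 2.0, size=(n, 1))
m = lambda x: np.sin(3.0 * x).ravel()
Y = m(X) + 0.1 * rng.standard_normal(n)

# Observed covariates carry small measurement errors that are neither
# independent nor centered: a constant bias plus a scaled random-walk term.
delta = 0.05
errors = delta * (0.5 + np.cumsum(rng.standard_normal((n, 1)), axis=0) / np.sqrt(n))
X_obs = X + errors

# Least squares fit of a single-hidden-layer sigmoidal network to the
# error-contaminated sample (X_obs, Y).
net = MLPRegressor(hidden_layer_sizes=(10,), activation="logistic",
                   solver="lbfgs", alpha=0.0, max_iter=5000, random_state=0)
net.fit(X_obs, Y)

# Empirical L2 error of the estimate against the true regression function,
# evaluated on an error-free test grid.
x_test = np.linspace(-2.0, 2.0, 400).reshape(-1, 1)
l2_err = np.mean((net.predict(x_test) - m(x_test)) ** 2)
print(f"empirical L2 error: {l2_err:.4f}")
```

Shrinking delta toward zero in this sketch mimics the "small measurement errors" regime of the result: the error of the fit on contaminated covariates approaches that of a fit on the error-free sample.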