We address the problem of estimating a function f: [0,1]^d ↦ [-L,L] using feedforward sigmoidal networks with a single hidden layer and bounded weights. The only information about the function is provided by an independently and identically distributed sample generated according to an unknown distribution. The quality of the estimate is quantified by the expected cost functional and depends on the sample size. We use Lipschitz properties of the cost functional and of the neural networks to derive the relationship between performance bounds and sample sizes within the framework of Valiant's probably approximately correct (PAC) learning.
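To make the flavor of such a result concrete, the following is a minimal sketch of the standard covering-number argument behind bounds of this type; it is not the paper's exact statement. The cost bound B, the Lipschitz constant \ell, and the covering number N(\cdot) are illustrative assumptions introduced here.

% Hedged sketch (assumptions: the cost C takes values in [0,B] and is
% \ell-Lipschitz in the weight vector w; N(r) denotes the size of an
% r-cover of the bounded weight set).
\[
  I(w) = \mathbb{E}\bigl[C(y, f_w(x))\bigr], \qquad
  \hat{I}_m(w) = \frac{1}{m}\sum_{i=1}^{m} C\bigl(y_i, f_w(x_i)\bigr).
\]
% Approximating every w by a point of an (\epsilon/4\ell)-cover costs at
% most \epsilon/2 in both the expected and the empirical cost; Hoeffding's
% inequality with a union bound over the cover handles the rest:
\[
  \Pr\Bigl[\sup_{w} \bigl|\hat{I}_m(w) - I(w)\bigr| > \epsilon\Bigr] \le \delta
  \quad \text{whenever} \quad
  m \ge \frac{2B^2}{\epsilon^2}
        \Bigl(\ln N\bigl(\tfrac{\epsilon}{4\ell}\bigr)
              + \ln\tfrac{2}{\delta}\Bigr).
\]

In arguments of this type, the Lipschitz properties of the cost functional and of the networks control how fast N(\cdot) grows with the precision, which is precisely what ties the performance bound to the required sample size.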