We show how lower bounds on the generalization ability of feedforward neural nets with real-valued outputs can be derived within a formalism based directly on the concept of VC dimension and on Vapnik's theorem on the uniform convergence of estimated probabilities.
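To illustrate the kind of uniform-convergence result the abstract refers to, the sketch below evaluates the classical VC-dimension deviation bound in its standard textbook form. The constants and the function name are illustrative assumptions, not taken from the paper itself.

```python
import math

def vc_uniform_convergence_bound(d, n, delta):
    """Classical Vapnik-style bound (textbook form, constants illustrative):
    with probability at least 1 - delta, the gap between empirical and true
    error is at most this value, for a class of VC dimension d and n samples."""
    return math.sqrt((d * (math.log(2 * n / d) + 1) + math.log(4 / delta)) / n)

# The bound shrinks as the sample size n grows relative to the VC dimension d.
for n in (1_000, 10_000, 100_000):
    print(n, round(vc_uniform_convergence_bound(d=50, n=n, delta=0.05), 3))
```

Read as a lower-bound statement, the same quantity indicates how many samples are necessary before the gap can be guaranteed small: for fixed confidence, meaningful guarantees require n to grow at least linearly in d.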