We examine the relationship between the VC-dimension and the number of parameters of a smoothly parametrized function class. We show that the VC-dimension of such a function class is at least k if there exists a k-dimensional differentiable manifold in the parameter space such that each member of the manifold corresponds to a different decision boundary. Using this result, we obtain lower bounds on the VC-dimension proportional to the number of parameters for several function classes, including two-layer neural networks with certain smooth activation functions and radial basis function networks with a Gaussian basis. These lower bounds hold even if the magnitudes of the parameters are restricted to be arbitrarily small. In Valiant's probably approximately correct (PAC) learning framework, this implies that the number of examples necessary for learning these function classes is at least linear in the number of parameters.
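For orientation, the result summarized above can be restated in symbols; the notation below is ours, a sketch based on the abstract rather than the paper's exact statement. Write the class as $F = \{f_\theta : \theta \in \mathbb{R}^{W}\}$, a set of $\{0,1\}$-valued functions depending smoothly on a parameter vector with $W$ components. The sufficient condition for the lower bound then reads, roughly,

\[
\exists\, M \subseteq \mathbb{R}^{W} \ \text{a $k$-dimensional differentiable manifold such that }
\theta \neq \theta' \in M \ \Rightarrow\ f_{\theta},\, f_{\theta'} \text{ have different decision boundaries}
\;\Longrightarrow\; \mathrm{VCdim}(F) \ge k ,
\]

and combining a bound of the form $\mathrm{VCdim}(F) = \Omega(W)$ with the standard PAC sample-complexity lower bound (for fixed small accuracy $\epsilon$ and confidence $\delta$) gives

\[
m(\epsilon, \delta) \;=\; \Omega\!\left(\frac{\mathrm{VCdim}(F)}{\epsilon}\right) \;=\; \Omega(W),
\]

i.e., the number of examples needed grows at least linearly in the number of parameters, as stated in the abstract.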