We investigate the PAC learnability of classes of {0,…,n}-valued functions. For n = 1, it is known that the finiteness of the Vapnik-Chervonenkis dimension is necessary and sufficient for learning. In this paper we present a general scheme for extending the VC-dimension to the case n > 1. Our scheme defines a wide variety of notions of dimension in which several variants of the VC-dimension, previously introduced in the context of learning, appear as special cases. Our main result is a simple condition characterizing the set of notions of dimension whose finiteness is necessary and sufficient for learning. This provides a variety of new tools for determining the learnability of a class of multi-valued functions. Our characterization is also shown to hold in the “robust” variant of the PAC model.
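To make these notions of dimension concrete, the following is a minimal brute-force sketch (an illustration written for this summary, not code from the paper) that computes two such dimensions for a finite class over a finite domain: the classical VC dimension for the {0,1}-valued case, and the Natarajan dimension, one well-known generalization to {0,…,n}-valued functions of the kind the paper's scheme subsumes. The representation of a class as a list of value tuples and the names vc_dim and natarajan_dim are assumptions made for the example.

```python
from itertools import combinations, product

# Illustrative sketch: a finite class is a list of tuples, where
# functions[i][x] is the value of the i-th function at domain point x.

def vc_dim(functions):
    """Largest k such that some k-point set is shattered, i.e. every
    0/1 labeling of the set is realized by some function in the class."""
    m = len(functions[0])
    d = 0
    for k in range(1, m + 1):
        if any(
            {tuple(f[x] for x in S) for f in functions}
            >= set(product((0, 1), repeat=k))
            for S in combinations(range(m), k)
        ):
            d = k          # some k-point set is shattered; try larger sets
        else:
            break          # subsets of shattered sets are shattered, so stop
    return d

def natarajan_dim(functions):
    """Natarajan dimension: a set S is shattered if two pointwise-distinct
    value patterns a, b on S exist such that every mix of a and b
    (a on a subset of S, b on the rest) is realized by the class."""
    m = len(functions[0])

    def shattered(S):
        restr = {tuple(f[x] for x in S) for f in functions}
        for a, b in product(restr, repeat=2):
            if all(u != v for u, v in zip(a, b)) and all(
                tuple(a[i] if keep else b[i] for i, keep in enumerate(mask)) in restr
                for mask in product((True, False), repeat=len(S))
            ):
                return True
        return False

    d = 0
    for k in range(1, m + 1):
        if any(shattered(S) for S in combinations(range(m), k)):
            d = k
        else:
            break
    return d

# Example: the full class of all {0,1,2}-valued functions on 3 points.
H = list(product(range(3), repeat=3))
print(natarajan_dim(H))                                    # 3
print(vc_dim([tuple(min(v, 1) for v in f) for f in H]))    # 3
```

Both routines scale exponentially and are meant only to pin down the shattering definitions; the point of the paper's characterization is precisely that finiteness of such a dimension, for the right family of definitions, is equivalent to learnability.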