PAC learnability under non-atomic measures: A problem by Vidyasagar
Theoretical Computer Science
In response to a 1997 problem of M. Vidyasagar, we state a necessary and sufficient condition for distribution-free PAC learnability of a concept class C under the family of all non-atomic (diffuse) measures on the domain Ω. Clearly, finiteness of the classical Vapnik-Chervonenkis dimension of C is a sufficient, but no longer necessary, condition. Moreover, learnability of C under non-atomic measures does not imply the uniform Glivenko-Cantelli property with respect to non-atomic measures. Our learnability criterion is stated in terms of a combinatorial parameter VC(C mod ω1), which we call the VC dimension of C modulo countable sets. The new parameter is obtained by "thickening up" single points in the definition of VC dimension to uncountable "clusters". Equivalently, VC(C mod ω1) ≤ d if and only if every countable subclass of C has VC dimension ≤ d outside a countable subset of Ω. The new parameter can also be expressed as the classical VC dimension of C calculated on a suitable subset of a compactification of Ω. We do not make any measurability assumptions on C, assuming instead the validity of Martin's Axiom (MA).
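As a concrete illustration of the classical notion the abstract builds on, the sketch below computes the VC dimension of a finite concept class by brute force: the largest sample size d such that some d-point sample is shattered (every subset of the sample is realized as a trace S ∩ C). The function name and the threshold-class example are ours, not from the paper, and the combinatorial parameter VC(C mod ω1) itself involves uncountable clusters and so has no finite analogue here.

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class.

    `concepts` is an iterable of subsets of `domain`. A finite
    sample S is shattered if every subset of S arises as S & C
    for some concept C in the class.
    """
    concepts = [frozenset(c) for c in concepts]
    best = 0
    for d in range(1, len(domain) + 1):
        shattered_some = False
        for sample in combinations(domain, d):
            s = frozenset(sample)
            traces = {s & c for c in concepts}
            if len(traces) == 2 ** d:  # all 2^d subsets of s realized
                shattered_some = True
                break
        if not shattered_some:
            break  # no sample of size d shatters, so none larger will
        best = d
    return best

# Example: threshold concepts {x : x <= t} on {0,...,4}.
# Any single point is shattered, but no pair {a, b} with a < b is,
# since the trace {b} alone is never realized; hence VC dimension 1.
domain = list(range(5))
thresholds = [{x for x in domain if x <= t} for t in range(-1, 5)]
print(vc_dimension(domain, thresholds))  # -> 1
```

A usage note: for the full power set of a 3-point domain the same function returns 3, since the whole domain is shattered; this brute-force search is exponential and only meant to make the shattering definition concrete.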