PAC learnability of a concept class under non-atomic measures: a problem by Vidyasagar
ALT'10 Proceedings of the 21st international conference on Algorithmic learning theory
SBRN '10 Proceedings of the 2010 Eleventh Brazilian Symposium on Neural Networks
In response to a 1997 problem of M. Vidyasagar, we state a criterion for PAC learnability of a concept class C under the family of all non-atomic (diffuse) measures on the domain Ω. The uniform Glivenko-Cantelli property with respect to non-atomic measures is no longer a necessary condition, and consistent learnability cannot in general be expected. Our criterion is stated in terms of a combinatorial parameter VC(C mod ω₁), which we call the VC dimension of C modulo countable sets. The new parameter is obtained by "thickening up" single points in the definition of VC dimension to uncountable "clusters". Equivalently, VC(C mod ω₁) ≤ d if and only if every countable subclass of C has VC dimension ≤ d outside a countable subset of Ω. The new parameter can also be expressed as the classical VC dimension of C calculated on a suitable subset of a compactification of Ω. We make no measurability assumptions on C, assuming instead the validity of Martin's Axiom (MA). Similar results are obtained for function learning in terms of the fat-shattering dimension modulo countable sets, but, just as in the classical distribution-free case, the finiteness of this parameter is sufficient but not necessary for PAC learnability under non-atomic measures.
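The parameter VC(C mod ω₁) modifies the classical notion of shattering, so it may help to recall the latter concretely. The sketch below (not from the paper; a brute-force illustration on a finite domain) checks shattering directly from its definition and computes the classical VC dimension of the class of intervals on a four-point domain:

```python
from itertools import combinations

def shatters(concepts, subset):
    """The class shatters `subset` iff every labeling of `subset`
    is realized as subset ∩ c for some concept c."""
    subset = frozenset(subset)
    realized = {subset & c for c in concepts}
    return len(realized) == 2 ** len(subset)

def vc_dimension(domain, concepts):
    """Largest size of a shattered subset of `domain` (brute force)."""
    d = 0
    for r in range(1, len(domain) + 1):
        if any(shatters(concepts, s) for s in combinations(domain, r)):
            d = r
    return d

# Intervals on {0,1,2,3}: any 2-point set is shattered, but no
# 3-point set is (the labeling "endpoints in, midpoint out" cannot
# be realized by an interval), so the VC dimension is 2.
domain = [0, 1, 2, 3]
intervals = [frozenset(range(a, b)) for a in range(5) for b in range(a, 5)]
print(vc_dimension(domain, intervals))  # → 2
```

The abstract's parameter replaces the single points of a shattered set with uncountable "clusters", which this finite computation cannot capture, but the combinatorial skeleton (realizing all 2^d labelings) is the same.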