In this work we study the relationship between PAC learning and the property of uniform convergence. We define the concept of polynomial uniform convergence of relative frequencies to probabilities in the distribution-dependent context. Let Xn = {0,1}^n, let Pn be a probability distribution on Xn, and let Fn ⊆ 2^Xn be a class of events. The family {(Xn, Pn, Fn)}n≥1 is said to be polynomially uniformly convergent if, for all n, the probability that the maximum difference (over Fn) between the relative frequency and the probability of an event exceeds a given positive ε is at most δ (0 < ε, δ < 1), for a sample size polynomial in n, 1/ε, and 1/δ. Given a t-sample (x1, …, xt), let Cn(t)(x1, …, xt) be the Vapnik-Chervonenkis dimension (VCdim) of the class {{x1, …, xt} ∩ f | f ∈ Fn}, and let M(n,t) be the expectation E(Cn(t)/t). The results we obtain are:

1. {(Xn, Pn, Fn)}n≥1 is polynomially uniformly convergent iff there exists β > 0 such that M(n,t) = O(n/t^β).
2. The family {(Xn, Fn)}n≥1 is polynomially uniformly convergent for all probability distributions Pn on Xn iff VCdim(Fn) is bounded by a polynomial p(n) iff {(Xn, Fn)}n≥1 is polynomial-sample learnable.
3. If {(Xn, Pn, Fn)}n≥1 is polynomially uniformly convergent then {(Xn, Pn, Fn)}n≥1 is polynomial-sample learnable, but there exist polynomial-sample learnable families {(Xn, Pn, Fn)}n≥1 which do not satisfy the property of polynomial uniform convergence.
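The quantity Cn(t) above is the VC dimension of the class Fn restricted (traced) to a given t-sample, and M(n,t) averages Cn(t)/t over samples. As a minimal sketch of these definitions on toy finite set systems, the hypothetical helpers `vc_dim` and `estimate_M` below compute the trace VC dimension by brute force and Monte-Carlo-estimate M(n,t); the brute-force search is exponential in the sample size, so this is illustrative only, not the paper's method.

```python
import itertools
import random

def vc_dim(points, sets):
    """VC dimension of the set system `sets` restricted to `points`:
    the size of the largest subset S of `points` such that every
    subset of S arises as S ∩ f for some f in `sets`."""
    points = list(points)
    traces = {frozenset(p for p in points if p in f) for f in sets}
    best = 0
    for k in range(1, len(points) + 1):
        found = False
        for S in itertools.combinations(points, k):
            # All 2^k subsets of S must appear among the traces cut down to S.
            needed = {frozenset(sub) for r in range(k + 1)
                      for sub in itertools.combinations(S, r)}
            got = {frozenset(set(S) & t) for t in traces}
            if needed <= got:
                best, found = k, True
                break
        if not found:  # if no k-set is shattered, no larger set is either
            break
    return best

def estimate_M(t, family, draw_point, trials=200):
    """Monte Carlo estimate of M(n,t) = E[Cn(t)/t], where `draw_point`
    samples one point of Xn according to Pn (hypothetical interface)."""
    total = 0.0
    for _ in range(trials):
        sample = {draw_point() for _ in range(t)}
        total += vc_dim(sample, family) / t
    return total / trials
```

For example, the family {∅, {1}, {2}, {3}} over the points {1, 2, 3} has trace VC dimension 1 (any single point is shattered, but no pair is, since no trace contains two points), while the full power set of {1, 2, 3} shatters all three points.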