Investigating the distribution assumptions in the PAC learning model
COLT '91: Proceedings of the Fourth Annual Workshop on Computational Learning Theory
We consider PAC-learning where the distribution is known to the student. The problem addressed here is to characterize when learnability with respect to a distribution D1 implies learnability with respect to a distribution D2. The answer depends on the learnability model. If the number of examples need not be bounded by a polynomial, it is sufficient to require that every set with zero probability with respect to D1 also have zero probability with respect to D2, i.e., that D2 be absolutely continuous with respect to D1. If the number of examples is required to be polynomial, then the probability of every set with respect to D2 must be bounded by a fixed multiplicative constant times its probability with respect to D1. Still more stringent conditions must hold if we insist that every hypothesis consistent with the examples be close to the target. Finally, we address the learnability properties of classes of distributions.
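To make the two conditions concrete, the following LaTeX sketch states them formally and records the one-line error-transfer bound behind the polynomial case. The notation is our gloss on the abstract, not the paper's: c is the multiplicative constant, A ranges over measurable sets, and the error of a hypothesis h against a target f is the probability of their symmetric difference.

% Both conditions are stated for every measurable set A.
% (1) Transfer when the number of examples need not be polynomial:
%     D2 absolutely continuous with respect to D1.
\[
  D_1(A) = 0 \;\Longrightarrow\; D_2(A) = 0 .
\]
% (2) Transfer with polynomially many examples: D2 dominated by a
%     constant multiple of D1, for some fixed c >= 1.
\[
  D_2(A) \;\le\; c \cdot D_1(A) .
\]
% Why (2) gives the transfer: for a target f and hypothesis h, the
% error set is the symmetric difference of h and f, so
\[
  \mathrm{err}_{D_2}(h) \;=\; D_2(h \,\triangle\, f)
  \;\le\; c \cdot D_1(h \,\triangle\, f)
  \;=\; c \cdot \mathrm{err}_{D_1}(h),
\]
% and a D1-learner run with accuracy parameter epsilon/c is
% epsilon-accurate under D2, at only a polynomial overhead in examples.

Condition (1) is strictly weaker than condition (2). As an illustrative pair (ours, not the paper's): over the positive integers, take D1(i) = 2^{-i} and D2(i) = 6/(pi^2 i^2). Every nonempty set has positive probability under both distributions, so (1) holds vacuously, yet the ratio D2(i)/D1(i) = (6/pi^2) 2^i / i^2 is unbounded, so no constant c satisfies (2).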