We consider the problem of learning real-valued functions from random examples when the function values are corrupted with noise. Under mild conditions on independent observation noise, we provide characterizations of the learnability of a real-valued function class in terms of a generalization of the Vapnik-Chervonenkis dimension, the fat-shattering function, introduced by Kearns and Schapire. We show that, given some restrictions on the noise, a function class is learnable in our model if and only if its fat-shattering function is finite. With different (also quite mild) restrictions, satisfied for example by Gaussian noise, we show that a function class is learnable from polynomially many examples if and only if its fat-shattering function grows polynomially. We prove analogous results in an agnostic setting, where there is no assumption of an underlying function class.
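For intuition, the sketch below spells out the fat-shattering (gamma-shattering) notion the abstract refers to, following the standard Kearns-Schapire definition, and estimates it by brute force for a tiny finite function class. The helper names, the toy class of affine functions, and the exhaustive search over witness levels are choices made for this illustration, not anything taken from the paper.

```python
# A minimal brute-force sketch of the fat-shattering function, using the
# standard Kearns-Schapire definition cited in the abstract. Everything here
# (function names, the toy class, the search strategy) is illustrative only.
import itertools

import numpy as np


def witness_candidates(vals, gamma):
    """Candidate witness levels for one point: the above/below split of `vals`
    changes only at the breakpoints v +/- gamma, and a level outside their
    range leaves no value on one side, so it cannot witness shattering."""
    breaks = sorted({v - gamma for v in vals} | {v + gamma for v in vals})
    gaps = [(a + b) / 2.0 for a, b in zip(breaks, breaks[1:])]
    return breaks + gaps


def is_fat_shattered(points, functions, gamma):
    """True iff `points` is gamma-shattered by the finite class `functions`:
    there are witness levels r_1..r_d such that every sign pattern b in {0,1}^d
    is realised by some f with f(x_i) >= r_i + gamma where b_i = 1 and
    f(x_i) <= r_i - gamma where b_i = 0."""
    d = len(points)
    if 2 ** d > len(functions):
        return False  # each f realises one pattern per witness, so |F| >= 2^d is needed
    values = np.array([[f(x) for x in points] for f in functions])  # shape (|F|, d)
    candidates = [witness_candidates(values[:, i], gamma) for i in range(d)]

    for witness in itertools.product(*candidates):
        r = np.array(witness)
        above = values >= r + gamma
        below = values <= r - gamma
        clean = (above | below).all(axis=1)   # margin >= gamma at every point
        patterns = {tuple(row) for row in above[clean]}
        if len(patterns) == 2 ** d:
            return True
    return False


def fat_shattering(domain, functions, gamma):
    """Largest d such that some d-point subset of `domain` is gamma-shattered."""
    fat = 0
    for d in range(1, len(domain) + 1):
        if any(is_fat_shattered(subset, functions, gamma)
               for subset in itertools.combinations(domain, d)):
            fat = d
        else:
            break
    return fat


if __name__ == "__main__":
    # Toy class: affine maps a*x + b with small coefficients, evaluated on [0, 1].
    functions = [lambda x, a=a, b=b: a * x + b
                 for a in (-1.0, 0.0, 1.0) for b in (-0.5, 0.0, 0.5)]
    domain = (0.0, 0.5, 1.0)
    for gamma in (0.2, 1.0):
        print(f"fat({gamma}) = {fat_shattering(domain, functions, gamma)}")
```

On this toy class the computed dimension shrinks as the scale gamma grows, which is exactly the scale-sensitivity that distinguishes the fat-shattering function from the ordinary (scale-free) pseudo-dimension.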