Learning juntas in the presence of noise
TAMC'06 Proceedings of the Third international conference on Theory and Applications of Models of Computation
We study the problem of PAC-learning Boolean functions with random attribute noise under the uniform distribution. We define a noisy distance measure for function classes and show that if this measure is small for a class C and an attribute noise distribution D, then C is not learnable with respect to the uniform distribution in the presence of noise generated according to D. The noisy distance measure is then characterized in terms of Fourier properties of the function class. We use this characterization to show that the class of all parity functions is not learnable for any but very concentrated noise distributions D. On the other hand, we show that if C is learnable with respect to the uniform distribution using a standard Fourier-based learning technique, then C is learnable with time and sample complexity that are also determined by the noisy distance. In fact, we show that an algorithm of this style is nearly the best possible for learning in the presence of attribute noise. As an application of our results, we show how to extend such an algorithm for learning AC0 so that it handles certain types of attribute noise with relatively little impact on the running time.
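To make the Fourier connection concrete: under independent attribute noise that flips bit i with probability p_i, the correlation between a parity chi_S evaluated on a clean example and on its noisy copy is the product over i in S of (1 - 2*p_i), so parities on large sets S have their Fourier coefficients attenuated exponentially. The following sketch (all names and parameter values are illustrative, not taken from the paper) checks this attenuation formula empirically for one parity under a product noise distribution:

```python
import random

random.seed(0)  # deterministic run for reproducibility

def parity(x, S):
    """chi_S(x) = (-1)^(sum of x[i] for i in S), as +/-1."""
    return -1 if sum(x[i] for i in S) % 2 else 1

def noisy_copy(x, p):
    """Flip each attribute x[i] independently with probability p[i]."""
    return [xi ^ (random.random() < pi) for xi, pi in zip(x, p)]

def empirical_attenuation(S, p, n, trials=100_000):
    """Estimate E[chi_S(x) * chi_S(x XOR noise)] over uniform x."""
    total = 0
    for _ in range(trials):
        x = [random.getrandbits(1) for _ in range(n)]
        total += parity(x, S) * parity(noisy_copy(x, p), S)
    return total / trials

n = 8
S = [0, 2, 5]          # hypothetical parity support
p = [0.1] * n          # hypothetical per-attribute flip rates

predicted = 1.0
for i in S:
    predicted *= 1 - 2 * p[i]   # product formula: (1 - 2*0.1)^3 = 0.512

observed = empirical_attenuation(S, p, n)
```

The observed correlation matches the predicted product up to sampling error, which is the mechanism behind the hardness result for parities: for noise distributions that are not very concentrated, the attenuation factor is tiny for most sets S, making the corresponding Fourier coefficients statistically invisible.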