Boosting a weak learning algorithm by majority. COLT '90 Proceedings of the third annual workshop on Computational learning theory.
Learning k-DNF with noise in the attributes. COLT '88 Proceedings of the first annual workshop on Computational learning theory.
When won't membership queries help? STOC '91 Proceedings of the twenty-third annual ACM symposium on Theory of computing.
Learning decision trees using the Fourier spectrum. STOC '91 Proceedings of the twenty-third annual ACM symposium on Theory of computing.
Toward efficient agnostic learning. COLT '92 Proceedings of the fifth annual workshop on Computational learning theory.
Exact identification of read-once formulas using fixed points of amplification functions. SIAM Journal on Computing.
Efficient noise-tolerant learning from statistical queries. STOC '93 Proceedings of the twenty-fifth annual ACM symposium on Theory of computing.
Weakly learning DNF and characterizing statistical query learning using Fourier analysis. STOC '94 Proceedings of the twenty-sixth annual ACM symposium on Theory of computing.
Uniform-distribution attribute noise learnability. COLT '99 Proceedings of the twelfth annual conference on Computational learning theory.
Noise-tolerant learning, the parity problem, and the statistical query model. STOC '00 Proceedings of the thirty-second annual ACM symposium on Theory of computing; Machine Learning.
Learning by extended statistical queries and its relation to PAC learning. EuroCOLT '95 Proceedings of the Second European Conference on Computational Learning Theory.
Learning with Queries Corrupted by Classification Noise. ISTCS '97 Proceedings of the Fifth Israel Symposium on the Theory of Computing Systems (ISTCS '97).
Constant depth circuits, Fourier transform, and learnability. SFCS '89 Proceedings of the 30th Annual Symposium on Foundations of Computer Science.
General bounds on statistical query learning and PAC learning with noise via hypothesis boosting. SFCS '93 Proceedings of the 1993 IEEE 34th Annual Foundations of Computer Science.
An efficient membership-query algorithm for learning DNF with respect to the uniform distribution. SFCS '94 Proceedings of the 35th Annual Symposium on Foundations of Computer Science.
Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning. ALT '02 Proceedings of the 13th International Conference on Algorithmic Learning Theory.
The Kushilevitz-Mansour (KM) algorithm finds all the "heavy" Fourier coefficients of a Boolean function. It is the main tool for learning decision trees and DNF expressions in the PAC model with respect to the uniform distribution. The algorithm requires access to a membership query (MQ) oracle. We weaken this requirement by giving an analogue of the KM algorithm that uses extended statistical queries (SQs): statistical queries in which the expectation is taken with respect to a distribution specified by the learning algorithm. We restrict the distributions that a learning algorithm may use for its SQs to a set of specific constant-bounded product distributions. Our analogue, which we call BS, finds all the "heavy" Fourier coefficients of degree lower than c log n. We use BS to learn decision trees and, by adapting Freund's boosting technique, we give an algorithm that learns DNF in this model. Learning in this model implies learning with persistent classification noise and, in some cases, can be extended to learning with product attribute noise. We develop a characterization of learnability with these extended SQs and apply it to obtain several negative results about the model.
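To make the Fourier-based idea concrete, the following Python sketch estimates the low-degree Fourier coefficients of a Boolean function from random uniform examples via empirical expectations (the kind of quantity a statistical query returns) and keeps those whose magnitude exceeds a threshold. It is only an illustration of the notion of "heavy" low-degree coefficients: the function names, the exhaustive search over small index sets, and the sample size are assumptions for the example, and this is not the paper's BS procedure nor the recursive KM search.

import itertools
import random

def chi(S, x):
    # Parity character chi_S(x) = (-1)^(sum_{i in S} x_i), for x in {0,1}^n.
    return -1 if sum(x[i] for i in S) % 2 else 1

def heavy_low_degree_coefficients(f, n, max_degree, threshold, samples=20000):
    # Estimate f_hat(S) = E_x[f(x) * chi_S(x)] under the uniform distribution
    # for every S with |S| < max_degree, reusing one shared random sample,
    # and return the sets whose estimated coefficient is large in magnitude.
    xs = [tuple(random.randint(0, 1) for _ in range(n)) for _ in range(samples)]
    ys = [f(x) for x in xs]  # labels in {-1, +1}
    heavy = {}
    for d in range(max_degree):
        for S in itertools.combinations(range(n), d):
            est = sum(y * chi(S, x) for x, y in zip(xs, ys)) / samples
            if abs(est) >= threshold:
                heavy[S] = est
    return heavy

if __name__ == "__main__":
    def target(x):
        # Planted example: f = chi_{{0,2}}, so the only heavy coefficient is S = (0, 2).
        return -1 if x[0] ^ x[2] else 1
    print(heavy_low_degree_coefficients(target, n=8, max_degree=3, threshold=0.3))

With roughly 20000 samples the sampling error of each estimate is far below the 0.3 threshold, so only the planted set (0, 2) is reported; a membership-query-based or SQ-based algorithm differs from this sketch mainly in how the search over candidate sets is organized.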