Communications of the ACM
The Strength of Weak Learnability. Machine Learning.
Equivalence of models for polynomial learnability. Information and Computation.
Cryptographic primitives based on hard learning problems. CRYPTO '93: Proceedings of the 13th Annual International Cryptology Conference on Advances in Cryptology.
Toward Efficient Agnostic Learning. Machine Learning (special issue on computational learning theory, COLT '92).
On the boosting ability of top-down decision tree learning algorithms. STOC '96: Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing.
Efficient noise-tolerant learning from statistical queries. Journal of the ACM (JACM).
On the boosting ability of top-down decision tree learning algorithms. Journal of Computer and System Sciences.
Learning Polynomials with Queries: The Highly Noisy Case. SIAM Journal on Discrete Mathematics.
A sieve algorithm for the shortest lattice vector problem. STOC '01: Proceedings of the Thirty-Third Annual ACM Symposium on Theory of Computing.
Noise-tolerant learning, the parity problem, and the statistical query model. Journal of the ACM (JACM).
Optimally-smooth adaptive boosting and application to agnostic learning. The Journal of Machine Learning Research.
On lattices, learning with errors, random linear codes, and cryptography. Proceedings of the Thirty-Seventh Annual ACM Symposium on Theory of Computing.
Learning nonsingular phylogenies and hidden Markov models. Proceedings of the Thirty-Seventh Annual ACM Symposium on Theory of Computing.
Agnostically Learning Halfspaces. FOCS '05: Proceedings of the 46th Annual IEEE Symposium on Foundations of Computer Science.
Boosting in the presence of noise. Journal of Computer and System Sciences (special issue: Learning Theory 2003).
New Results for Learning Noisy Parities and Halfspaces. FOCS '06: Proceedings of the 47th Annual IEEE Symposium on Foundations of Computer Science.
APPROX '05/RANDOM '05: Proceedings of the 8th International Workshop on Approximation, Randomization and Combinatorial Optimization Problems, and of the 9th International Conference on Randomization and Computation: Algorithms and Techniques.
Authenticating pervasive devices with human protocols. CRYPTO '05: Proceedings of the 25th Annual International Conference on Advances in Cryptology.
Extracting Computational Entropy and Learning Noisy Linear Functions. COCOON '09: Proceedings of the 15th Annual International Conference on Computing and Combinatorics.
Bounding the average sensitivity and noise sensitivity of polynomial threshold functions. Proceedings of the Forty-Second ACM Symposium on Theory of Computing.
Hardness of Reconstructing Multivariate Polynomials over Finite Fields. SIAM Journal on Computing.
The motivating problem is agnostically learning parity functions, i.e., learning parities under arbitrary or adversarial noise. Specifically, given random labeled examples drawn from an *arbitrary* distribution, we would like to produce a hypothesis whose accuracy nearly matches that of the best parity function. Our algorithm runs in time 2^(O(n/log n)), which matches the best known running time for the easier problems of learning parities with random classification noise (Blum et al., 2003) and of agnostically learning parities over the uniform distribution on inputs (Feldman et al., 2006). Our approach is as follows. We prove an agnostic boosting theorem that nearly achieves optimal accuracy, improving upon earlier studies (starting with Ben-David et al., 2001). To do so, we circumvent previous lower bounds by altering the boosting model. We then show that the (random-noise) parity learning algorithm of Blum et al. (2000) fits our new model of an agnostic weak learner. Our agnostic boosting framework is completely general and may be applied to other agnostic learning problems. It also sheds light on the actual difficulty of agnostic learning by showing that full agnostic boosting is indeed possible.
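To make the agnostic benchmark concrete, here is a small illustrative sketch (not the paper's algorithm): for tiny n one can find the best-fitting parity by brute force over all 2^n subsets, which defines the accuracy an agnostic learner must approach. The paper's contribution is achieving this benchmark in time 2^(O(n/log n)) rather than 2^n; the sample size, noise rate, and all function names below are illustrative choices, not the paper's.

```python
import random

def parity(subset_mask, x):
    # chi_S(x): XOR of the bits of x indexed by the subset S (as a bitmask).
    return bin(subset_mask & x).count("1") % 2

def best_parity_accuracy(examples, n):
    # Agnostic benchmark: fraction of labels matched by the single
    # best parity function, found here by exhaustive search.
    m = len(examples)
    best = 0.0
    for mask in range(2 ** n):
        agree = sum(parity(mask, x) == y for x, y in examples)
        best = max(best, agree / m)
    return best

random.seed(0)
n = 6
target = 0b101010  # hidden parity over input bits {1, 3, 5}
examples = []
for _ in range(500):
    x = random.randrange(2 ** n)
    y = parity(target, x)
    if random.random() < 0.1:  # flip 10% of labels (noise)
        y ^= 1
    examples.append((x, y))

# With ~10% label noise, the best parity agrees with roughly 90%
# of the sample; an agnostic learner must come close to this.
print(best_parity_accuracy(examples, n))
```

The brute-force search above takes time 2^n per evaluation pass, which is exactly the exponential dependence the paper's agnostic boosting approach improves upon.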