Cryptographic limitations on learning Boolean formulae and finite automata
STOC '89 Proceedings of the twenty-first annual ACM symposium on Theory of computing
Redundant noisy attributes, attribute errors, and linear-threshold learning using winnow
COLT '91 Proceedings of the fourth annual workshop on Computational learning theory
Learning in the presence of malicious errors
SIAM Journal on Computing
Statistical queries and faulty PAC oracles
COLT '93 Proceedings of the sixth annual conference on Computational learning theory
Fat-shattering and the learnability of real-valued functions
Journal of Computer and System Sciences
A decision-theoretic generalization of on-line learning and an application to boosting
Journal of Computer and System Sciences - Special issue: 26th annual ACM symposium on the theory of computing & STOC'94, May 23–25, 1994, and second annual European conference on computational learning theory (EuroCOLT'95), March 13–15, 1995
Learning nested differences in the presence of malicious noise
Theoretical Computer Science - Special issue on algorithmic learning theory
General convergence results for linear discriminant updates
COLT '97 Proceedings of the tenth annual conference on Computational learning theory
Large margin classification using the perceptron algorithm
COLT' 98 Proceedings of the eleventh annual conference on Computational learning theory
Specification and simulation of statistical query algorithms for efficiency and noise tolerance
Journal of Computer and System Sciences - Special issue on the eighth annual workshop on computational learning theory, July 5–8, 1995
Learning conjunctions with noise under product distributions
Information Processing Letters
Generalization performance of support vector machines and other pattern classifiers
Advances in kernel methods
The robustness of the p-norm algorithms
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
An adaptive version of the boost by majority algorithm
COLT '99 Proceedings of the twelfth annual conference on Computational learning theory
Sample-efficient strategies for learning in the presence of noise
Journal of the ACM (JACM)
On-line learning with malicious noise and the closure algorithm
Annals of Mathematics and Artificial Intelligence
EuroCOLT '99 Proceedings of the 4th European Conference on Computational Learning Theory
PAC Analogues of Perceptron and Winnow via Boosting the Margin
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
MadaBoost: A Modification of AdaBoost
COLT '00 Proceedings of the Thirteenth Annual Conference on Computational Learning Theory
Hard-core distributions for somewhat hard problems
FOCS '95 Proceedings of the 36th Annual Symposium on Foundations of Computer Science
FOCS '99 Proceedings of the 40th Annual Symposium on Foundations of Computer Science
Learning disjunctions of conjunctions
IJCAI'85 Proceedings of the 9th international joint conference on Artificial intelligence - Volume 1
Quantum DNF Learnability Revisited
COCOON '02 Proceedings of the 8th Annual International Conference on Computing and Combinatorics
Optimally-Smooth Adaptive Boosting and Application to Agnostic Learning
ALT '02 Proceedings of the 13th International Conference on Algorithmic Learning Theory
Smooth boosting using an information-based criterion
ALT'06 Proceedings of the 17th international conference on Algorithmic Learning Theory
We describe a new boosting algorithm that generates only smooth distributions, i.e., distributions that do not assign too much weight to any single example. We show that this new boosting algorithm can be used to construct efficient PAC learning algorithms which tolerate relatively high rates of malicious noise. In particular, we use the new smooth boosting algorithm to construct malicious-noise-tolerant versions of the PAC-model p-norm linear threshold learning algorithms described in [23]. The sample complexity and malicious noise tolerance bounds of these new PAC algorithms closely correspond to known bounds for the online p-norm algorithms of Grove, Littlestone and Schuurmans [14] and Gentile and Littlestone [13]. As special cases of our new algorithms we obtain linear threshold learning algorithms that match the sample complexity and malicious noise tolerance of the online Perceptron and Winnow algorithms. Our analysis reveals an interesting connection between boosting and noise tolerance in the PAC setting.
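The smoothness idea in the abstract can be illustrated with a short sketch. In MadaBoost-style smooth boosting, each example's raw exponential weight is capped at 1 before normalizing, so no single (possibly maliciously corrupted) example can dominate the distribution passed to the weak learner. The function name and the choice of cap value below are illustrative assumptions, not the paper's exact formulation.

```python
import math

def smooth_weights(margins):
    """Illustrative smooth-boosting weight update (MadaBoost-style cap).

    margins[i] is example i's cumulative margin so far. The raw weight
    exp(-margin) is capped at 1, so heavily misclassified examples
    (e.g. noisy ones with very negative margins) cannot dominate the
    normalized distribution -- the 'smoothness' property.
    """
    raw = [min(1.0, math.exp(-m)) for m in margins]
    total = sum(raw)
    return [w / total for w in raw]

# Example: one badly misclassified example (margin -2.0) receives the
# same capped weight as a borderline one (margin 0.0), instead of an
# exponentially larger weight as in plain AdaBoost.
w = smooth_weights([0.0, -2.0, 3.0, 1.0])
```

Without the cap, the margin -2.0 example would carry raw weight e^2 ≈ 7.4 and absorb most of the distribution; with the cap its weight is tied at the maximum, which is what keeps the induced distributions smooth.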