A new composition theorem for learning algorithms
STOC '98 Proceedings of the thirtieth annual ACM symposium on Theory of computing
The Composition Lemma is one of the strongest tools for learning complex concept classes. It shows that if a class is learnable, then composing it with any class containing polynomially many concepts again yields a learnable class. In this paper we extend the Composition Lemma as follows: we show that composing an attribute-efficiently learnable class with a learnable class of polynomial shatter coefficient yields a learnable class. This result generalizes many results in the literature and gives polynomial-time learning algorithms for new classes.
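For concreteness, here is the standard formulation of class composition and of the shatter coefficient that the abstract refers to; this is a sketch of the usual definitions, not necessarily the paper's exact notation. Given a class $\mathcal{F}$ of Boolean functions on $k$ inputs and a class $\mathcal{G}$ of concepts over a domain $X$, the composed class is
\[
\mathcal{F} \circ \mathcal{G} \;=\; \bigl\{\, x \mapsto f\bigl(g_1(x),\ldots,g_k(x)\bigr) \;:\; f \in \mathcal{F},\ g_1,\ldots,g_k \in \mathcal{G} \,\bigr\},
\]
and the shatter coefficient (growth function) of $\mathcal{G}$ at sample size $m$ is
\[
\Pi_{\mathcal{G}}(m) \;=\; \max_{x_1,\ldots,x_m \in X} \bigl|\bigl\{ \bigl(g(x_1),\ldots,g(x_m)\bigr) : g \in \mathcal{G} \bigr\}\bigr|.
\]
"Polynomial shatter coefficient" then means that $\Pi_{\mathcal{G}}(m)$ is bounded by a fixed polynomial in $m$. Note that the classical Composition Lemma corresponds to a special case: if $\mathcal{G}$ contains only polynomially many concepts, then $\Pi_{\mathcal{G}}(m) \le |\mathcal{G}|$ for every $m$, so its shatter coefficient is automatically polynomial.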