Information Processing Letters
Decision trees are a popular representation of Boolean functions. We show that, given an alternative representation of a Boolean function f, say as a read-once branching program, one can find a decision tree T that approximates f to any desired accuracy. Moreover, the size of T is at most that of the smallest decision tree representing f, and the construction can be carried out in quasi-polynomial time. We also extend this result to the case where one has access only to a source of random evaluations of f rather than a complete representation. In this case, we show that a similar approximation can be obtained with any specified confidence (as opposed to the absolute certainty of the former case). This latter result implies proper PAC-learnability of decision trees under the uniform distribution without the use of membership queries.
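The random-examples setting described above can be illustrated with a minimal sketch: draw uniformly random evaluations of a target Boolean function and greedily grow a decision tree that fits them, then check how well the tree agrees with the target over the whole cube. The target function, sample size, and greedy split rule below are illustrative assumptions, not the paper's algorithm, which achieves its guarantees with quite different machinery.

```python
import random
from itertools import product

def f(x):
    # Illustrative target Boolean function (an assumption, not from the paper).
    return x[0] and (x[1] or x[2])

def majority(samples):
    # Majority label among labeled samples (x, y).
    return 2 * sum(y for _, y in samples) >= len(samples)

def build_tree(samples, variables, depth):
    # Greedy recursive construction: stop on pure samples, exhausted
    # variables, or the depth limit; internal nodes are (var, left, right).
    labels = {y for _, y in samples}
    if len(labels) == 1:
        return labels.pop()
    if not samples or not variables or depth == 0:
        return majority(samples) if samples else False

    def split_error(v):
        # Total majority-vote error if we split on variable v.
        err = 0
        for side in (False, True):
            part = [(x, y) for x, y in samples if x[v] == side]
            if part:
                maj = majority(part)
                err += sum(1 for _, y in part if y != maj)
        return err

    v = min(variables, key=split_error)
    rest = [u for u in variables if u != v]
    left = build_tree([(x, y) for x, y in samples if not x[v]], rest, depth - 1)
    right = build_tree([(x, y) for x, y in samples if x[v]], rest, depth - 1)
    return (v, left, right)

def evaluate(tree, x):
    # Walk from the root to a Boolean leaf.
    while isinstance(tree, tuple):
        v, left, right = tree
        tree = right if x[v] else left
    return tree

n = 3
random.seed(0)
# Random evaluations of f under the uniform distribution on {0,1}^n.
samples = [(x, f(x)) for x in (tuple(random.random() < 0.5 for _ in range(n))
                               for _ in range(200))]
tree = build_tree(samples, list(range(n)), depth=n)

# Agreement with f over the entire cube, i.e. accuracy under the
# uniform distribution.
agreement = sum(evaluate(tree, x) == f(x)
                for x in product([False, True], repeat=n)) / 2 ** n
print(agreement)
```

With enough samples, every point of the small cube is observed and the greedy tree agrees with f everywhere; in the regime the paper considers (n large, only polynomially many samples), one instead settles for approximation with high confidence, which is exactly the PAC guarantee mentioned above.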