Induction of decision trees is one of the most successful approaches to supervised machine learning. Branching programs generalize decision trees and, according to the boosting analysis, can be exponentially more efficient to learn. In practice, however, this advantage has not materialized. Decision trees are easy to simplify by pruning, and reduced error pruning is one of the simplest decision tree pruning algorithms; for branching programs, no pruning algorithms are known. In this paper we prove that reduced error pruning of branching programs is infeasible: finding the optimal pruning of a branching program with respect to a set of pruning examples that is disjoint from the training examples is NP-complete. This intractability result forces us to consider approximate reduced error pruning. Unfortunately, even finding an approximate solution of arbitrary accuracy turns out to be computationally infeasible; in particular, reduced error pruning of branching programs is APX-hard. Our experiments show that, despite these negative theoretical results, heuristic pruning of branching programs can reduce their size without significantly degrading accuracy.
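To make the pruning operation concrete, the following is a minimal sketch of reduced error pruning for the easy case the abstract contrasts against: a binary decision tree with a separate pruning set. All names here (Node, prune, and so on) are illustrative assumptions, not taken from the paper; the sketch greedily replaces a subtree with a majority-vote leaf whenever doing so does not increase error on the pruning examples.

    # Minimal sketch of reduced error pruning for a binary decision tree.
    # Illustrative only: the paper's hardness results concern branching
    # programs, not this simple tree routine.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Node:
        feature: Optional[int] = None      # feature index tested at an internal node
        left: Optional["Node"] = None      # subtree followed when the feature is 0
        right: Optional["Node"] = None     # subtree followed when the feature is 1
        label: Optional[int] = None        # class label if the node is a leaf

        def is_leaf(self) -> bool:
            return self.label is not None

    def predict(node: Node, x: list) -> int:
        while not node.is_leaf():
            node = node.right if x[node.feature] else node.left
        return node.label

    def errors(node: Node, examples: list) -> int:
        """Number of pruning-set examples (x, y) the subtree misclassifies."""
        return sum(predict(node, x) != y for x, y in examples)

    def majority_label(examples: list, default: int) -> int:
        labels = [y for _, y in examples]
        return max(set(labels), key=labels.count) if labels else default

    def prune(node: Node, examples: list, default: int = 0) -> Node:
        """Bottom-up reduced error pruning: replace a subtree with a
        majority leaf whenever that does not increase pruning-set error."""
        if node.is_leaf():
            return node
        # Route the pruning examples to the children and prune them first.
        left_ex = [(x, y) for x, y in examples if not x[node.feature]]
        right_ex = [(x, y) for x, y in examples if x[node.feature]]
        node.left = prune(node.left, left_ex, default)
        node.right = prune(node.right, right_ex, default)
        leaf = Node(label=majority_label(examples, default))
        # Keep the subtree only if it strictly beats the single leaf.
        return leaf if errors(leaf, examples) <= errors(node, examples) else node

In a tree, each pruning example reaches exactly one node per level, so this bottom-up pass is cheap and each replacement decision is local. In a branching program a node may have several parents, so replacing a shared subgraph affects many root-to-sink paths at once; it is this interaction that underlies the NP-completeness and APX-hardness results stated above.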