Error bounds for decision trees are generally based on either the depth or the breadth of the tree. In this paper, we propose a bound on the error rate that depends on both the depth and the breadth of the specific decision tree constructed from the training samples. This bound is derived from a sample-complexity estimate based on PAC learnability. The proposed bound is compared with other traditional error bounds on several machine learning benchmark data sets, as well as on an image data set used in Content-Based Image Retrieval (CBIR). Experimental results demonstrate that the proposed bound gives a tighter estimate of the empirical error.
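The abstract does not give the bound's exact form, but the general shape of a PAC-style (Occam's razor) bound that uses both depth and breadth can be sketched as follows. Everything here is an illustrative assumption, not the paper's actual bound: the hypothesis-class size is crudely estimated from the tree's number of leaves (breadth), its depth, and the number of available features, and plugged into the standard Hoeffding-based generalization inequality.

```python
import math

def pac_generalization_bound(train_error, m, depth, n_leaves,
                             n_features, delta=0.05):
    """Hypothetical Occam-style PAC bound for a binary decision tree.

    train_error: empirical error on the m training samples
    depth, n_leaves: depth and breadth of the learned tree
    n_features: number of input features
    delta: confidence parameter (bound holds w.p. >= 1 - delta)

    NOTE: the log-cardinality estimate below is a rough illustrative
    count of trees with this depth and breadth, not the paper's bound.
    """
    n_internal = n_leaves - 1  # internal nodes of a binary tree
    # crude log-count of hypotheses: a feature/threshold choice per
    # internal node, a binary label per leaf, plus a depth-dependent
    # term for the tree's structural choices
    log_h = (n_internal * math.log(2 * n_features)
             + n_leaves * math.log(2)
             + depth * math.log(2))
    # standard PAC / Hoeffding-style deviation term
    return train_error + math.sqrt((log_h + math.log(1 / delta)) / (2 * m))
```

As expected of such a bound, it loosens as the tree grows deeper or broader and tightens as the number of training samples m increases, which is the qualitative behavior the abstract's comparison experiments would exercise.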