Rademacher penalization is a modern technique for obtaining data-dependent bounds on the generalization error of classifiers. Because of computational complexity issues, it has so far appeared to be limited to relatively simple hypothesis classes. In this paper we nevertheless apply Rademacher penalization to the practically important hypothesis class of unrestricted decision trees, by considering the prunings of a given decision tree rather than the tree-growing phase. This study constitutes the first application of Rademacher penalization to a hypothesis class of practical significance. We present two variations of the approach: one in which the hypothesis class consists of all prunings of the initial tree, and another in which only the prunings that are accurate on the growing data are taken into account. Moreover, we generalize the error-bounding approach from binary classification to multi-class settings. Our empirical experiments indicate that the proposed bounds outperform distribution-independent bounds for decision tree prunings and provide non-trivial error estimates on real-world data sets.
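The core computation behind Rademacher penalization can be illustrated with a minimal sketch: draw random ±1 (Rademacher) labels for the sample and measure how well the best hypothesis in the class can correlate with that pure noise; the resulting penalty bounds the gap between training and generalization error. The sketch below is an assumption-laden toy, not the paper's algorithm — it uses a small class of threshold classifiers as a stand-in for the prunings of a grown decision tree, and estimates the penalty by Monte Carlo averaging over sign draws.

```python
import random

def empirical_rademacher_penalty(hypotheses, X, n_trials=100, seed=0):
    """Monte Carlo estimate of the empirical Rademacher penalty
    sup_h (1/n) * sum_i sigma_i * h(x_i), averaged over random sign draws.

    `hypotheses` is a finite class of functions mapping x -> {-1, +1};
    in the paper's setting this role is played by the prunings of a
    given decision tree (hypothetical stand-in here)."""
    rng = random.Random(seed)
    n = len(X)
    total = 0.0
    for _ in range(n_trials):
        # Random Rademacher signs: +1 or -1 with equal probability.
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        # How well can the class fit this noise? Take the supremum
        # (here a max over the finite class) of the correlation.
        best = max(
            sum(s * h(x) for s, x in zip(sigma, X)) / n
            for h in hypotheses
        )
        total += best
    return total / n_trials

# Toy hypothesis class: threshold classifiers on the real line
# (illustrative only -- not actual decision tree prunings).
thresholds = [lambda x, t=t: 1 if x >= t else -1 for t in range(10)]
X = [0.5 * i for i in range(20)]
penalty = empirical_rademacher_penalty(thresholds, X)
```

A richer class fits random signs better and thus incurs a larger penalty, which is exactly why restricting attention to the prunings of one grown tree (rather than all trees) keeps the bound computable and tight.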