We study predicate selection functions (also known as splitting rules) for structural decision trees and propose two improvements to existing schemes. The first is in classification learning, where we reconsider the use of accuracy as a predicate selection function and show that, on practical grounds, it is a better alternative to other commonly used functions. The second is in regression learning, where we consider the standard mean squared error measure and give a predicate pruning result for it.
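The two measures named in the abstract can be sketched concretely. Below is a minimal illustration of accuracy as a predicate selection function (each child node predicts its majority class) and of mean squared error for regression splits (each child predicts its mean). The function names, the toy label lists, and the candidate predicates `A` and `B` are illustrative assumptions, not taken from the paper.

```python
from collections import Counter

def accuracy_score(partitions):
    """Accuracy of a candidate split: each child node predicts its
    majority class. `partitions` is a list of label lists, one per child."""
    total = sum(len(p) for p in partitions)
    correct = sum(Counter(p).most_common(1)[0][1] for p in partitions if p)
    return correct / total

def mse_score(partitions):
    """Mean squared error of a candidate split for regression: each
    child node predicts the mean of its target values."""
    total = sum(len(p) for p in partitions)
    sse = 0.0
    for p in partitions:
        if p:
            mean = sum(p) / len(p)
            sse += sum((v - mean) ** 2 for v in p)
    return sse / total

# Choosing among candidate predicates: take the split with the highest
# accuracy (classification) or the lowest MSE (regression).
labels_a = [["+", "+", "+", "-"], ["-", "-"]]   # split induced by predicate A
labels_b = [["+", "-", "+", "-"], ["+", "-"]]   # split induced by predicate B
best, _ = max([("A", labels_a), ("B", labels_b)],
              key=lambda kv: accuracy_score(kv[1]))
```

Here predicate A scores 5/6 accuracy while B scores only 1/2, so A would be selected. A pruning result of the kind the abstract mentions would let a learner discard candidate predicates whose best achievable MSE cannot beat the current best, without evaluating them fully; the bound itself is not reproduced here.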