A Further Comparison of Splitting Rules for Decision-Tree Induction
Machine Learning
Several splitting criteria for binary classification trees are shown to be expressible as weighted sums of two divergence values. This weighted-sum representation is then used to construct two families of splitting criteria: one contains the chi-squared and entropy criteria, the other contains the mean posterior improvement criterion. Members of both families are shown to possess the exclusive preference property. The optimal splits produced by the proposed families are also studied; the best splits turn out to depend on the family parameters, and the results reveal interesting differences among the various criteria. Examples demonstrate the usefulness of both families.
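To make the weighted-sum formulation concrete, the following is a minimal Python sketch (the function names and notation are ours, not the paper's). It scores a binary split as the weighted sum of two divergences between each child's class distribution and the parent's: with the Kullback-Leibler divergence this sum equals the entropy criterion (information gain), and with the chi-squared divergence it is proportional to Pearson's chi-squared statistic for the split table.

import numpy as np

def kl_divergence(p, q):
    # KL divergence D(p || q) for discrete distributions, with 0*log(0) := 0.
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def chi2_divergence(p, q):
    # Chi-squared divergence: sum_k (p_k - q_k)^2 / q_k.
    return float(np.sum((p - q) ** 2 / q))

def split_criterion(left_counts, right_counts, divergence):
    # Score a binary split as a weighted sum of two divergence values:
    # sum over the two children of (child proportion) * divergence(child dist, parent dist).
    left = np.asarray(left_counts, dtype=float)
    right = np.asarray(right_counts, dtype=float)
    parent = left + right
    n = parent.sum()
    p_parent = parent / n
    score = 0.0
    for child in (left, right):
        w = child.sum() / n            # fraction of cases sent to this child
        p_child = child / child.sum()  # class distribution within the child
        score += w * divergence(p_child, p_parent)
    return score

# Hypothetical class counts for the two children of a candidate split.
print(split_criterion([30, 10], [5, 55], kl_divergence))    # entropy criterion (information gain)
print(split_criterion([30, 10], [5, 55], chi2_divergence))  # Pearson chi-squared statistic / n

Varying the divergence plugged into this weighted sum is what generates a family of criteria; the sketch shows only the two members named in the abstract.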