Decision trees are probably the most popular and most widely used classification models. They are built recursively, following a top-down approach (from general concepts to particular examples), by repeatedly splitting the training dataset. The chosen splitting criterion can affect the accuracy of the resulting classifier, although usually not significantly: none of the splitting criteria proposed in the literature has proved to be universally better than the rest. However, even though they all yield similar results, their computational complexity varies considerably, and not all of them are suitable for building multi-way decision trees. Here we propose two new splitting rules that obtain results comparable to those of well-known criteria when used to build multi-way decision trees, while their simplicity makes them ideal for non-expert users.
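To make the role of a splitting criterion concrete, the following sketch scores a candidate multi-way split with two of the classical impurity-based criteria: information gain (entropy reduction, as in C4.5) and Gini-index reduction (as in CART). It is purely illustrative and does not implement the two new splitting rules proposed here; the toy data and helper names are our own.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity of a sequence of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_score(parent, children, impurity):
    """Impurity decrease achieved by splitting `parent` into `children`.

    The tree-growing algorithm evaluates every candidate split this way
    and greedily picks the one with the highest score.
    """
    n = len(parent)
    weighted = sum(len(child) / n * impurity(child) for child in children)
    return impurity(parent) - weighted

# Toy example: a three-way split of 8 training examples with classes a/b.
parent = ["a", "a", "a", "a", "b", "b", "b", "b"]
children = [["a", "a", "a"], ["b", "b", "b"], ["a", "b"]]

print(split_score(parent, children, entropy))  # information gain: 0.75
print(split_score(parent, children, gini))     # Gini reduction:   0.375
```

Both criteria rank this split highly because two of the three child nodes are pure; in practice, as noted above, such criteria tend to agree on the chosen split far more often than they disagree.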