One approach to learning classification rules from examples is to build decision trees. A review and comparison paper by Mingers (Mingers, 1989) examined the first stage of tree building, in which a “splitting rule” is used to grow trees with a greedy recursive partitioning algorithm. That paper considered a number of different measures and experimentally examined their behavior on four domains; its main conclusion was that a random splitting rule does not significantly decrease classification accuracy. This note suggests an alternative experimental method and presents additional results on further domains. Our results indicate that random splitting leads to increased error, which is at variance with the findings presented by Mingers.
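To make the comparison concrete, here is a minimal Python sketch of greedy recursive partitioning with a pluggable splitting rule. It is not taken from either paper: the function names (`grow`, `info_gain`, `random_rule`) and the discrete-attribute data layout are illustrative assumptions. Swapping `random_rule` in for `info_gain` reproduces, in miniature, the kind of contrast under study.

```python
import math
import random
from collections import Counter

# Each example is (features_tuple, label); attributes are discrete-valued.
# This layout and all names below are assumptions for illustration only.

def entropy(labels):
    """Shannon entropy (bits) of a non-empty list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(examples, attr):
    """Information-gain splitting rule: entropy reduction from splitting
    on attribute index `attr`."""
    labels = [y for _, y in examples]
    groups = {}
    for x, y in examples:
        groups.setdefault(x[attr], []).append(y)
    after = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - after

def random_rule(examples, attr):
    """Random splitting rule: every attribute gets a coin-flip score."""
    return random.random()

def grow(examples, attrs, rule, depth=0, max_depth=5):
    """Greedy recursive partitioning: pick the attribute the splitting
    rule scores highest, partition the examples on it, and recurse."""
    labels = [y for _, y in examples]
    majority = Counter(labels).most_common(1)[0][0]
    if not attrs or depth == max_depth or len(set(labels)) == 1:
        return majority  # leaf: predict the majority class
    best = max(attrs, key=lambda a: rule(examples, a))
    children = {}
    for value in set(x[best] for x, _ in examples):
        subset = [(x, y) for x, y in examples if x[best] == value]
        children[value] = grow(subset, [a for a in attrs if a != best],
                               rule, depth + 1, max_depth)
    return (best, majority, children)  # internal node

# Example: four two-attribute examples; grow once with information gain,
# then with the random rule for comparison.
data = [((0, 1), "+"), ((1, 1), "+"), ((1, 0), "-"), ((0, 0), "-")]
tree_ig = grow(data, [0, 1], info_gain)
tree_rand = grow(data, [0, 1], random_rule)
```

Under this sketch, the experimental question both papers address amounts to measuring test-set error of trees grown with `info_gain`-style rules versus `random_rule` across domains.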