This paper presents new experimental evidence against the utility of Occam's razor. A systematic procedure is presented for post-processing decision trees produced by C4.5. This procedure was derived by rejecting Occam's razor and instead attending to the assumption that similar objects are likely to belong to the same class. It increases a decision tree's complexity without altering the tree's performance on the training data from which it is inferred. Across a variety of common learning tasks, the resulting more complex decision trees are shown to have, on average, higher predictive accuracy than the original, less complex trees. This result raises considerable doubt about the utility of Occam's razor as it is commonly applied in modern machine learning.
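The abstract does not spell out the post-processing procedure, but the core idea — grow a tree more complex in regions the training data leaves empty, relabelling those regions after nearby training examples, while leaving training-set predictions untouched — can be sketched in miniature. The following is a toy illustration under stated assumptions, not the paper's actual C4.5 post-processor: the one-split "tree", the hand-picked 0.5 threshold, and the 1-nearest-neighbour relabelling rule are all illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))          # toy 2-D training set
y = (X[:, 0] + X[:, 1] > 1).astype(int)       # diagonal target concept

# A deliberately simple "tree": one split on x0 (a stand-in for a C4.5
# tree; the 0.5 threshold is an illustrative assumption).
def tree_leaf(Xq):
    return (Xq[:, 0] > 0.5).astype(int)

# Each leaf predicts the majority class of its training examples.
leaf_class = np.array(
    [np.bincount(y[tree_leaf(X) == l]).argmax() for l in (0, 1)]
)

def tree_predict(Xq):
    return leaf_class[tree_leaf(Xq)]

def nearest_index(Xq):
    # Index of each query's nearest training example (brute force).
    d = ((Xq[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return d.argmin(1)

def grafted_predict(Xq):
    # "Graft": if a query's nearest training example lies in a different
    # leaf, the query sits in a subregion its own leaf has no evidence
    # for, so defer to that neighbour's class ("similar objects are
    # likely to belong to the same class"). A training point is its own
    # nearest neighbour, so predictions on the training data are
    # unchanged -- the property the abstract emphasises.
    idx = nearest_index(Xq)
    same_leaf = tree_leaf(Xq) == tree_leaf(X[idx])
    return np.where(same_leaf, tree_predict(Xq), y[idx])
```

The grafted model is strictly more complex (it carves extra decision boundaries inside each leaf) yet, by construction, agrees with the plain tree on every training example — so any accuracy difference on fresh data is attributable to how the empty regions are labelled, mirroring the paper's argument.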