Empirical Learning Aided by Weak Domain Knowledge in the Form of Feature Importance
CMSP '11 Proceedings of the 2011 International Conference on Multimedia and Signal Processing - Volume 01
Decision tree algorithms cannot learn accurately from a small training set because they recursively partition the data, leaving very few instances at the lower levels of the tree. Additional domain knowledge has been shown to enhance the performance of learners. We present an algorithm named Importance Aided Decision Tree (IADT) that takes feature importance as an additional form of domain knowledge. Since a decision tree algorithm seeks the most discriminative attribute at each node, feature importance scores are a natural fit for decision tree learning. Our algorithm uses a novel approach to incorporate these scores into the tree-induction process, making the resulting decision trees more accurate and robust. We present theoretical and empirical performance analyses showing that IADT outperforms standard decision tree learning algorithms.
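One plausible way to combine feature importance with split selection, sketched below under the assumption (not stated in the abstract) that the domain-supplied importance score simply scales each attribute's information gain; the function names and the weighting rule are hypothetical, not the paper's actual method:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a list of class labels.
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def weighted_gain(rows, labels, attr, importance):
    # Information gain of splitting on `attr`, scaled by a
    # domain-supplied importance score (hypothetical combination rule).
    n = len(labels)
    partitions = {}
    for row, y in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(y)
    remainder = sum(len(p) / n * entropy(p) for p in partitions.values())
    return importance[attr] * (entropy(labels) - remainder)

def best_attribute(rows, labels, attrs, importance):
    # Choose the split attribute with maximal importance-weighted gain.
    return max(attrs, key=lambda a: weighted_gain(rows, labels, a, importance))
```

With equal importance scores this reduces to ordinary information-gain splitting; a low score can demote an otherwise attractive attribute, which is one way weak domain knowledge could steer induction when training data is scarce.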