C4.5: Programs for Machine Learning
Learning decision tree classifiers. ACM Computing Surveys (CSUR)
BOAT—optimistic decision tree construction. SIGMOD '99: Proceedings of the 1999 ACM SIGMOD International Conference on Management of Data
Machine Learning
Elegant Decision Tree Algorithm for Classification in Data Mining. WISEw '02: Proceedings of the Third International Conference on Web Information Systems Engineering (Workshops)
Data Mining: Concepts and Techniques
Decision Tree Algorithm based on Sampling. NPC '07: Proceedings of the 2007 IFIP International Conference on Network and Parallel Computing Workshops
Privacy preserving ID3 using Gini Index over horizontally partitioned data. AICCSA '08: Proceedings of the 2008 IEEE/ACS International Conference on Computer Systems and Applications
Decision trees have been widely and successfully applied to data classification in data mining and machine learning. A critical component of any decision tree algorithm is the criterion used to select which attribute becomes the test attribute in a given branch of the tree. Several algorithms, including ID3, C4.5, and CART, are investigated and compared with respect to how they determine the attributes for the root and the branches of a decision tree. This paper presents issues in the traditional ID3 algorithm that directly affect the building of decision trees. The goal is to identify the critical factors in the ID3 algorithm that yield an efficient decision tree for classification applications. Examples are given to illustrate which factors may affect the construction of decision trees.
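The attribute-selection criterion discussed above can be made concrete with a minimal sketch of ID3's information-gain measure: the attribute chosen for a branch is the one that most reduces the entropy of the class labels. The dataset, function names, and attribute encoding below are illustrative assumptions, not taken from the paper.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """Reduction in label entropy from splitting on attribute index `attr`."""
    n = len(labels)
    gain = entropy(labels)
    # Partition the labels by the attribute's value, then subtract the
    # weighted entropy of each partition from the parent entropy.
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    for subset in partitions.values():
        gain -= (len(subset) / n) * entropy(subset)
    return gain

# Toy dataset (hypothetical): each row is (outlook, windy); labels are yes/no.
rows = [("sunny", "true"), ("sunny", "false"),
        ("rain", "true"), ("rain", "false")]
labels = ["no", "no", "yes", "yes"]

# ID3 picks the attribute with the highest information gain as the test
# attribute; here "outlook" (index 0) separates the classes perfectly.
best = max(range(2), key=lambda a: information_gain(rows, labels, a))
```

C4.5 refines this criterion with the gain ratio, and CART uses the Gini index instead; all three follow the same select-and-split pattern sketched here.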