C4.5: programs for machine learning
An Instance-Weighting Method to Induce Cost-Sensitive Trees
IEEE Transactions on Knowledge and Data Engineering
Decision trees with minimal costs
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Training Cost-Sensitive Neural Networks with Methods Addressing the Class Imbalance Problem
IEEE Transactions on Knowledge and Data Engineering
Representing conditional independence using decision trees
AAAI'05 Proceedings of the 20th national conference on Artificial intelligence - Volume 2
The foundations of cost-sensitive learning
IJCAI'01 Proceedings of the 17th international joint conference on Artificial intelligence - Volume 2
Improving the ranking performance of decision trees
ECML'06 Proceedings of the 17th European conference on Machine Learning
Hybrid cost-sensitive decision tree
PKDD'05 Proceedings of the 9th European conference on Principles and Practice of Knowledge Discovery in Databases
Although decision tree learning has achieved great success in building classifiers, most existing methods do not account for unequal weights among instances in the training and test sets, even though many real-world data sets are imbalanced in nature. In this paper, we introduce an improved weight-based decision tree that takes imbalanced instance weights into account to address class-imbalance problems. The proposed algorithm is simpler and more effective to implement than previous decision trees. We compare it experimentally with C4.5, a standard decision tree algorithm, and the results show that it significantly outperforms C4.5 in classification accuracy on UCI data sets.
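The abstract does not specify the exact weighting scheme, but a common way to inject instance weights into a decision tree is to assign each training instance an inverse-class-frequency weight and then use the weighted class distribution when computing a node's entropy. The sketch below illustrates that idea; the function names (`balance_weights`, `weighted_entropy`) are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def balance_weights(labels):
    """Inverse-class-frequency weights: every class contributes
    equal total weight, so the minority class is up-weighted."""
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    return [n / (k * counts[y]) for y in labels]

def weighted_entropy(labels, weights):
    """Entropy of the weighted class distribution at a tree node.
    With uniform weights this reduces to the usual C4.5 entropy."""
    total = sum(weights)
    mass = Counter()
    for y, w in zip(labels, weights):
        mass[y] += w
    return -sum((m / total) * math.log2(m / total)
                for m in mass.values() if m > 0)
```

For a 9:1 imbalanced node, the unweighted entropy is about 0.47, while under balanced instance weights the same node has entropy 1.0, so splits that separate the minority class are rewarded much more strongly.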