Boosting Trees for Cost-Sensitive Classifications
ECML '98: Proceedings of the 10th European Conference on Machine Learning
This paper explores two techniques for boosting cost-sensitive trees. The techniques differ in whether misclassification cost information is used during training. We demonstrate that each technique excels at different aspects of cost-sensitive classification. We also show that both techniques provide a means to overcome the weaknesses of their base cost-sensitive tree induction algorithm.
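The distinction the abstract draws can be made concrete with a minimal sketch: in one variant, misclassification costs enter boosting through the initial instance weights (costs used during training); in the other, boosting runs with uniform weights and costs are applied only at prediction time via a minimum-expected-cost rule. The function names, the decision-stump base learner, and the logistic calibration of the boosted margin below are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustive search for the best weighted axis-aligned threshold stump."""
    best_err, best = np.inf, None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= t, sign, -sign)
                err = w[pred != y].sum()
                if err < best_err:
                    best_err, best = err, (j, t, sign)
    return best, best_err

def predict_stump(stump, X):
    j, t, sign = stump
    return np.where(X[:, j] <= t, sign, -sign)

def adaboost(X, y, w0, rounds=10):
    """AdaBoost over stumps; costs can enter only via the initial weights w0."""
    w = w0 / w0.sum()
    ensemble = []
    for _ in range(rounds):
        stump, err = fit_stump(X, y, w)
        err = max(err, 1e-10)               # guard against log(0)
        if err >= 0.5:
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w = w * np.exp(-alpha * y * predict_stump(stump, X))
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def margin(ensemble, X):
    return sum(a * predict_stump(s, X) for a, s in ensemble)

# Variant 1 (costs during training): pass cost-skewed initial weights,
# e.g. w0[i] proportional to the cost of misclassifying instance i.
# Variant 2 (costs after training): boost with uniform w0, then choose the
# label minimizing expected cost under a probability estimate of the margin.
def predict_min_cost(ensemble, X, c_fp, c_fn):
    p_pos = 1.0 / (1.0 + np.exp(-2.0 * margin(ensemble, X)))  # crude calibration
    # predict +1 when expected cost of predicting -1 (c_fn * p_pos)
    # exceeds expected cost of predicting +1 (c_fp * (1 - p_pos))
    return np.where(c_fn * p_pos > c_fp * (1.0 - p_pos), 1, -1)
```

With uniform costs `c_fp = c_fn`, the minimum-expected-cost rule reduces to the usual sign of the boosted margin; raising `c_fn` shifts the decision threshold toward predicting the positive class, without retraining the ensemble.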