This paper explores two simple and efficient pre-pruning strategies for the cost-sensitive decision tree algorithm to avoid overfitting. One is to limit the cost-sensitive decision trees to a depth of two. The other is to prune the trees with a pre-specified threshold. An empirical study shows that, compared with the error-based tree algorithm C4.5 and several other cost-sensitive tree algorithms, the new cost-sensitive decision trees with pre-pruning are more efficient and perform well on most UCI data sets.
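The two pre-pruning strategies can be illustrated with a minimal sketch, which is not the paper's exact algorithm: a cost-sensitive tree chooses leaf labels and splits to minimize expected misclassification cost, stops at a hard depth limit of two, and also stops when the best split's cost reduction does not exceed a pre-specified threshold. All names (`leaf_label`, `build_tree`, the `cost_matrix` layout) are illustrative assumptions.

```python
def leaf_label(labels, cost_matrix):
    """Return (label, cost): the class minimizing total misclassification cost.

    cost_matrix[actual][predicted] gives the cost of predicting `predicted`
    for an example whose true class is `actual` (illustrative layout).
    """
    best, best_cost = None, float("inf")
    for pred in sorted(cost_matrix):
        cost = sum(cost_matrix[actual][pred] for actual in labels)
        if cost < best_cost:
            best, best_cost = pred, cost
    return best, best_cost

def build_tree(X, y, cost_matrix, depth=0, max_depth=2, threshold=0.0):
    label, node_cost = leaf_label(y, cost_matrix)
    # Pre-pruning strategy 1: hard depth limit (two, as in the paper).
    if depth >= max_depth or len(set(y)) == 1:
        return {"leaf": label}
    best = None
    for f in range(len(X[0])):
        for t in sorted(set(row[f] for row in X)):
            left = [i for i, row in enumerate(X) if row[f] <= t]
            right = [i for i, row in enumerate(X) if row[f] > t]
            if not left or not right:
                continue
            _, lc = leaf_label([y[i] for i in left], cost_matrix)
            _, rc = leaf_label([y[i] for i in right], cost_matrix)
            reduction = node_cost - (lc + rc)
            if best is None or reduction > best[0]:
                best = (reduction, f, t, left, right)
    # Pre-pruning strategy 2: stop if the best split's cost reduction
    # does not exceed a pre-specified threshold.
    if best is None or best[0] <= threshold:
        return {"leaf": label}
    _, f, t, left, right = best
    return {
        "feature": f, "thresh": t,
        "left": build_tree([X[i] for i in left], [y[i] for i in left],
                           cost_matrix, depth + 1, max_depth, threshold),
        "right": build_tree([X[i] for i in right], [y[i] for i in right],
                            cost_matrix, depth + 1, max_depth, threshold),
    }

def predict(tree, x):
    while "leaf" not in tree:
        tree = tree["left"] if x[tree["feature"]] <= tree["thresh"] else tree["right"]
    return tree["leaf"]
```

With an asymmetric cost matrix (for example, a false negative costing five times a false positive), the leaf labels and split choices shift toward the expensive class, while both stopping rules keep the tree small without any post-pruning pass.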