Cost-sensitive decision trees and cost-sensitive naïve Bayes are two recently proposed cost-sensitive learning models that aim to minimize the total cost of tests and misclassifications. Each has its own advantages and disadvantages. In this paper, we propose a novel hybrid cost-sensitive learning model, called DTNB, which reduces the minimum total cost by integrating the advantages of the cost-sensitive decision tree with those of the cost-sensitive naïve Bayes. We evaluate it empirically over various test strategies, and our experiments show that DTNB significantly outperforms both the cost-sensitive decision tree and the cost-sensitive naïve Bayes in minimizing the total cost of tests and misclassifications under the same sequential test strategies and single-batch strategies.
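The abstract does not include the DTNB algorithm itself, but the core principle all such cost-sensitive models share is predicting the class with the lowest expected misclassification cost rather than the most probable class. A minimal sketch of that decision rule follows; the function name, cost matrix values, and probabilities are illustrative, not taken from the paper.

```python
import numpy as np

def min_expected_cost_class(probs, cost_matrix):
    """Return the class index minimizing expected misclassification cost.

    probs         -- P(true class = j | x) for each class j
    cost_matrix   -- cost_matrix[i, j] is the cost of predicting class i
                     when the true class is j (diagonal is typically 0)
    """
    expected_costs = cost_matrix @ probs  # expected cost of each prediction
    return int(np.argmin(expected_costs))

# Illustrative two-class setup: missing a positive (predicting class 0 when
# the truth is class 1) costs 5, the reverse error costs 1.
cost = np.array([[0.0, 5.0],
                 [1.0, 0.0]])
p = np.array([0.7, 0.3])  # model believes "negative" is more likely

# Expected cost of predicting 0: 5 * 0.3 = 1.5; of predicting 1: 1 * 0.7 = 0.7,
# so the cost-sensitive prediction is class 1 despite class 0 being more probable.
print(min_expected_cost_class(p, cost))  # → 1
```

This is why a cost-sensitive model can disagree with a plain accuracy-maximizing classifier: when errors have asymmetric costs, the cheaper expected outcome may be the less probable class.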