Cost-sensitive classification with respect to waiting cost
Knowledge-Based Systems
Cost-sensitive decision tree learning is an important and popular topic in the machine learning and data mining communities. Most of the existing literature focuses on misclassification cost and test cost. In real-world applications, however, time sensitivity should also be considered in cost-sensitive learning. In this paper, we treat the time-sensitive cost in cost-sensitive learning as a waiting cost (referred to as WC), and propose a novel splitting criterion for constructing a cost-time-sensitive (CTS) decision tree that maximally decreases the intangible cost. A hybrid test strategy that combines the sequential and batch test strategies is then adopted in CTS learning. Finally, extensive experiments show that our algorithm outperforms the alternatives with respect to reducing misclassification cost.
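The abstract does not give the exact splitting formula, but the general idea of a cost-sensitive splitting criterion that also charges a waiting cost can be sketched as follows. This is a minimal illustration, assuming a gain-per-unit-cost criterion (as used in several test-cost-sensitive trees); the function names, the constant `+ 1.0` smoothing term, and the way WC is combined with test cost are hypothetical, not taken from the paper.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def cts_split_score(labels, partitions, test_cost, waiting_cost):
    """Score a candidate attribute: information gain divided by its total
    cost, where total cost combines the attribute's test cost and its
    waiting cost (hypothetical criterion for illustration only)."""
    n = len(labels)
    remainder = sum(len(p) / n * entropy(p) for p in partitions)
    gain = entropy(labels) - remainder
    # The +1.0 guards against division by zero for cost-free attributes.
    return gain / (test_cost + waiting_cost + 1.0)

# Example: a hypothetical attribute that splits six examples perfectly.
labels = ['+', '+', '+', '-', '-', '-']
partitions = [['+', '+', '+'], ['-', '-', '-']]
score = cts_split_score(labels, partitions, test_cost=5.0, waiting_cost=2.0)
```

Under such a criterion, an attribute with high discriminative power but a long waiting time (e.g. a lab test whose result arrives hours later) is penalized relative to a slightly weaker attribute that is available immediately, which is the trade-off the CTS tree is designed to capture.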