Lookahead-based algorithms for anytime induction of decision trees
ICML '04: Proceedings of the Twenty-First International Conference on Machine Learning
Most existing algorithms for learning decision trees are greedy: the tree is induced top-down, making a locally optimal decision at each node. In most cases, however, the constructed tree is not globally optimal. Furthermore, greedy algorithms require a fixed amount of time and cannot produce a better tree when additional time is available. To overcome these limitations, we present a lookahead-based algorithm for anytime induction of decision trees that allows trading computation time for tree quality. The algorithm uses a novel strategy for evaluating candidate splits: a stochastic version of ID3 is repeatedly invoked to estimate the size of the tree each split would yield, and the split that minimizes the expected size is preferred. Experimental results indicate that for several hard concepts, our proposed approach exhibits good anytime behavior and yields significantly better decision trees when more time is available.
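The split-evaluation strategy described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: it assumes a stochastic ID3 that picks each split with probability proportional to its information gain, and all names (`sid3_size`, `choose_split`) and the sample count `r` are our own illustrative choices.

```python
import math
import random
from collections import Counter, defaultdict

def entropy(labels):
    """Shannon entropy of a label multiset."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr):
    """Information gain of splitting (rows, labels) on attr."""
    groups = defaultdict(list)
    for row, y in zip(rows, labels):
        groups[row[attr]].append(y)
    remainder = sum(len(g) / len(labels) * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

def partition(rows, labels, attr):
    """Group (rows, labels) by the value of attr."""
    groups = defaultdict(lambda: ([], []))
    for row, y in zip(rows, labels):
        groups[row[attr]][0].append(row)
        groups[row[attr]][1].append(y)
    return groups

def sid3_size(rows, labels, attrs):
    """Stochastic ID3: grow a tree, choosing each split with probability
    proportional to its information gain, and return its node count."""
    if len(set(labels)) <= 1 or not attrs:
        return 1  # leaf
    gains = [info_gain(rows, labels, a) for a in attrs]
    if sum(gains) == 0:
        attr = random.choice(attrs)          # no informative attribute
    else:
        attr = random.choices(attrs, weights=gains)[0]
    rest = [a for a in attrs if a != attr]
    return 1 + sum(sid3_size(rs, ls, rest)
                   for rs, ls in partition(rows, labels, attr).values())

def choose_split(rows, labels, attrs, r=5):
    """Prefer the split whose induced subtrees have the smallest total
    size, estimated by averaging r stochastic-ID3 samples per split."""
    best, best_size = None, float("inf")
    for attr in attrs:
        groups = partition(rows, labels, attr)
        rest = [a for a in attrs if a != attr]
        est = sum(sum(sid3_size(rs, ls, rest) for rs, ls in groups.values())
                  for _ in range(r)) / r
        if est < best_size:
            best, best_size = attr, est
    return best
```

On a hard concept such as XOR, every attribute has zero gain at the root, so greedy ID3 cannot distinguish the relevant attributes from noise; the size-based estimate above does, because splitting on a relevant attribute leads to much smaller sampled trees.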