Many problems encountered in practice involve predicting a continuous attribute associated with an example. This task, known as regression, requires examining samples of past experience with known continuous answers and generalizing them into a regression model that can be used to predict future examples. Regression algorithms have been investigated extensively in statistics, machine learning, and data mining, but they usually lack measures that indicate how "good" their predictions are. Tolerance regions, i.e., ranges of possible predicted values, can provide a measure of reliability for every bare prediction. In this paper, we focus on tree-based prediction models, i.e., model trees, and resort to inductive inference to output tolerance regions in addition to bare predictions. In particular, we consider model trees mined by SMOTI (Stepwise Model Tree Induction), a system for data-driven stepwise construction of model trees with regression and splitting nodes, and we extend the definition of such trees so that a tolerance region is built and associated with each leaf. Experiments evaluate the validity and quality of the output tolerance regions.
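To make the idea concrete, the following is a minimal sketch of how a tolerance region could be attached to a single model-tree leaf. It is an illustrative inductive-conformal-style construction, not SMOTI's exact procedure: a linear model is fitted on the leaf's training examples, absolute residuals on a held-out calibration portion serve as nonconformity scores, and their empirical quantile sizes an interval expected to cover future examples with probability about 1 − ε. The function name and signature are hypothetical.

```python
import numpy as np

def leaf_tolerance_region(X_train, y_train, X_cal, y_cal, x_new, epsilon=0.1):
    """Illustrative tolerance region for one model-tree leaf.

    Fits a least-squares linear model on the leaf's training examples,
    then uses absolute residuals on a held-out calibration set to size
    an interval around the prediction for x_new (an inductive-conformal
    style sketch; not SMOTI's exact procedure).
    """
    # Linear model on the leaf's training portion (intercept column added).
    A = np.column_stack([np.ones(len(X_train)), X_train])
    coef, *_ = np.linalg.lstsq(A, y_train, rcond=None)

    # Nonconformity scores: absolute residuals on the calibration set.
    A_cal = np.column_stack([np.ones(len(X_cal)), X_cal])
    scores = np.abs(y_cal - A_cal @ coef)

    # Interval half-width: (1 - epsilon) empirical quantile of the scores.
    k = int(np.ceil((1.0 - epsilon) * (len(scores) + 1))) - 1
    half_width = np.sort(scores)[min(k, len(scores) - 1)]

    # Bare prediction for the new example, widened into a tolerance region.
    y_hat = np.array([1.0, *np.atleast_1d(x_new)]) @ coef
    return y_hat - half_width, y_hat + half_width
```

In this sketch the region's width is constant across the leaf; a richer construction could scale the residuals by a local difficulty estimate to obtain example-specific widths.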