Globally Optimal Fuzzy Decision Trees for Classification and Regression
IEEE Transactions on Pattern Analysis and Machine Intelligence
A global optimization algorithm is designed to find the parameters of a CART regression tree extended with linear predictors at its leaves. To make the optimization mathematically tractable, the internal decisions of the CART tree are made continuous: the crisp decisions at the internal nodes are replaced with soft ones. The algorithm then adjusts the parameters of the tree in a manner similar to the backpropagation algorithm in multilayer perceptrons. This procedure makes it possible to generate regression trees that are optimized with respect to a global cost function, that represent the unknown function continuously, and whose architecture is determined automatically by the data. Integrating complementary features of symbolic and connectionist methods in a single decision system improves prediction accuracy on both synthetic and real-world regression problems.
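The core idea of softening a crisp split can be illustrated with a toy sketch (this is an illustration of the general technique, not the paper's implementation): a single sigmoidal gate routes each input with a continuous weight to two linear leaf models, and all parameters, the gate and the leaves, are fit jointly by gradient descent on a global squared-error cost, analogous to backpropagation. All names and the depth-1 architecture here are assumptions chosen for brevity.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class SoftSplitTree:
    """Depth-1 soft regression tree (illustrative sketch): one sigmoidal
    gate routes inputs continuously between two linear leaf models."""

    def __init__(self, dim, rng):
        self.w = rng.normal(scale=0.1, size=dim)              # gate weights
        self.b = 0.0                                          # gate bias
        self.leaf = rng.normal(scale=0.1, size=(2, dim + 1))  # two linear leaves (weights + bias)

    def predict(self, X):
        g = sigmoid(X @ self.w + self.b)                  # soft membership in the right branch
        Xb = np.hstack([X, np.ones((len(X), 1))])         # append bias column
        y0, y1 = Xb @ self.leaf[0], Xb @ self.leaf[1]     # leaf predictions
        return g * y1 + (1 - g) * y0                      # gated blend, differentiable everywhere

    def fit(self, X, y, lr=0.2, steps=5000):
        """Gradient descent on mean squared error over all parameters."""
        Xb = np.hstack([X, np.ones((len(X), 1))])
        n = len(X)
        for _ in range(steps):
            g = sigmoid(X @ self.w + self.b)
            y0, y1 = Xb @ self.leaf[0], Xb @ self.leaf[1]
            err = g * y1 + (1 - g) * y0 - y
            # leaf gradients: each leaf is responsible in proportion to its gate weight
            self.leaf[1] -= lr * (err * g) @ Xb / n
            self.leaf[0] -= lr * (err * (1 - g)) @ Xb / n
            # gate gradients: chain rule through the sigmoid
            dg = err * (y1 - y0) * g * (1 - g)
            self.w -= lr * dg @ X / n
            self.b -= lr * dg.sum() / n

# Fit a piecewise-linear target, where a single linear model underfits
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.where(X[:, 0] > 0, 2 * X[:, 0], -X[:, 0])
tree = SoftSplitTree(dim=1, rng=rng)
tree.fit(X, y)
mse = np.mean((tree.predict(X) - y) ** 2)
```

Because the sigmoid keeps the gate differentiable, the split position and the leaf models are tuned together against one global cost, rather than greedily node by node as in standard CART.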