Bootstrapping is a simple technique typically used to assess the accuracy of estimates of model parameters, using simple plug-in principles to replace sometimes unwieldy theory with computer simulation. Common uses include variance estimation and confidence interval construction for model parameters. It also provides a way to estimate the prediction accuracy of regression models with continuous and class-valued outcomes. In this paper we give an overview of some of these applications of the bootstrap, focusing on bootstrap estimates of prediction error, and also explore how the bootstrap can be used to improve the prediction accuracy of unstable models, such as tree-structured classifiers, through aggregation. The improvements can typically be attributed to variance reduction in the classical regression setting and, more generally, to a smoothing of decision boundaries in the classification setting. These advances have important implications for how atmospheric prediction models can be improved, and illustrations of this will be shown. For class-valued outcomes, an interesting graphic known as the CAT scan can be constructed to help understand the aggregated decision boundary. This will be illustrated using simulated data.
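The two uses sketched above — plug-in bootstrap estimation of a standard error, and bootstrap aggregation (bagging) of an unstable classifier by majority vote — can be illustrated in a few lines of pure Python. This is a minimal sketch, not the paper's implementation: the toy datasets, the one-split decision stump, and parameters such as `n_boot` and the number of stumps are invented for illustration.

```python
import random
import statistics

def resample(data, rng):
    """Draw one bootstrap sample: n draws with replacement from the data."""
    return [rng.choice(data) for _ in range(len(data))]

# --- Use 1: plug-in bootstrap estimate of a standard error ----------------
def bootstrap_se(data, stat, n_boot=2000, seed=0):
    """Standard deviation of `stat` over bootstrap replicates,
    the plug-in estimate of its standard error."""
    rng = random.Random(seed)
    return statistics.stdev(stat(resample(data, rng)) for _ in range(n_boot))

values = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9]   # toy sample (invented)
se_mean = bootstrap_se(values, statistics.mean)      # roughly sd / sqrt(n)

# --- Use 2: bagging an unstable classifier --------------------------------
def fit_stump(points):
    """Fit a one-split decision stump to (x, label) pairs, labels in {0, 1}."""
    best = None
    for t in sorted({x for x, _ in points}):
        for left in (0, 1):
            err = sum((left if x <= t else 1 - left) != y for x, y in points)
            if best is None or err < best[0]:
                best = (err, t, left)
    return best[1], best[2]

def predict(stump, x):
    t, left = stump
    return left if x <= t else 1 - left

def bag_predict(stumps, x):
    """Aggregate by majority vote over the bootstrap-trained stumps."""
    votes = sum(predict(s, x) for s in stumps)
    return 1 if 2 * votes > len(stumps) else 0

# Toy separable training set (invented); each stump is fit to a resample.
train = [(0.10, 0), (0.30, 0), (0.45, 0), (0.55, 1), (0.70, 1), (0.90, 1)]
rng = random.Random(1)
stumps = [fit_stump(resample(train, rng)) for _ in range(25)]
```

Averaging votes over many resamples is what smooths the single stump's hard threshold into a more stable aggregated decision boundary, the variance-reduction effect discussed above.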