Machine Learning
Recently, many authors have proposed new algorithms that improve the accuracy of certain classifiers by assembling a collection of individual classifiers, each obtained by resampling the training sample. Bagging and boosting are well-known methods in the machine learning literature, and they have proved successful in classification problems. In the regression context, however, the application of these techniques has received little attention. Our aim is to analyse, through simulation studies, when boosting and bagging can reduce the training set error and the generalization error when nonparametric regression methods are used as predictors. In this work, we consider three such methods: projection pursuit regression (PPR), multivariate adaptive regression splines (MARS), and local learning based on recursive covering (DART).
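To make the resampling idea concrete, the following is a minimal sketch of bagging for regression, not the paper's experimental setup: each base learner is fitted on a bootstrap replicate of the training sample and the ensemble prediction is the average of the individual predictions. The regression-stump base learner and all function names here are illustrative assumptions; the paper instead uses PPR, MARS, and DART as base predictors.

```python
import random
import statistics

def bootstrap_sample(xs, ys, rng):
    """Draw n (x, y) pairs with replacement from the training sample."""
    n = len(xs)
    idx = [rng.randrange(n) for _ in range(n)]
    return [xs[i] for i in idx], [ys[i] for i in idx]

def fit_stump(xs, ys):
    """Illustrative base learner: a one-split regression stump that
    predicts the mean of y on each side of the threshold minimising
    the total squared error."""
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        ml, mr = statistics.fmean(left), statistics.fmean(right)
        sse = (sum((y - ml) ** 2 for y in left)
               + sum((y - mr) ** 2 for y in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda x: ml if x <= t else mr

def bagged_predictor(xs, ys, n_estimators=25, seed=0):
    """Bagging: fit one base learner per bootstrap replicate and
    average their predictions."""
    rng = random.Random(seed)
    learners = [fit_stump(*bootstrap_sample(xs, ys, rng))
                for _ in range(n_estimators)]
    return lambda x: statistics.fmean(m(x) for m in learners)
```

Averaging over bootstrap replicates mainly reduces the variance component of the generalization error, which is why bagging tends to help unstable predictors most.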