Bagging frequently improves the predictive performance of a model. An online version has recently been introduced, which attempts to gain the benefits of an online algorithm while approximating regular bagging. However, regular online bagging is an approximation to its batch counterpart and so is not lossless with respect to the bagging operation. By operating under the Bayesian paradigm, we introduce an online Bayesian version of bagging which is exactly equivalent to the batch Bayesian version, and thus, when combined with a lossless learning algorithm, gives a completely lossless online bagging algorithm. We also note that the Bayesian formulation resolves a theoretical problem with bagging, produces less variability in its estimates, and can improve predictive performance for smaller data sets.
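The weighting scheme behind this equivalence is easy to sketch. In batch Bayesian bagging, each ensemble member weights the n training points with a Dirichlet(1, ..., 1) vector; drawing an independent Gamma(1, 1) weight per point and leaving the vector unnormalized yields the same ensemble whenever the base learner is invariant to rescaling of the weights, and those Gamma draws can be made one point at a time as data arrive. Below is a minimal sketch of this idea; the class name OnlineBayesianBagging and all parameter choices are illustrative rather than taken from the paper, and scikit-learn's SGDClassifier is only a stand-in weighted online learner (SGD itself is not lossless, so a lossless incremental learner would be substituted to obtain the fully lossless algorithm).

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

class OnlineBayesianBagging:
    """Illustrative online Bayesian bagging ensemble (a sketch, not the
    paper's implementation).

    Each member weights every incoming example by an independent
    Gamma(1, 1) draw, the unnormalized form of the Dirichlet(1, ..., 1)
    weights of the batch Bayesian bootstrap, so the online and batch
    versions induce identically distributed weighted data sets. Regular
    online bagging would draw Poisson(1) counts here, which only
    approximates the multinomial counts of batch bagging.
    """

    def __init__(self, n_estimators=10, classes=(0, 1), seed=0):
        self.rng = np.random.default_rng(seed)
        self.classes = np.asarray(classes)
        # SGDClassifier stands in for any online learner that accepts
        # per-example weights; it is not itself a lossless learner.
        self.members = [SGDClassifier() for _ in range(n_estimators)]

    def update(self, x, y):
        """Absorb a single (x, y) pair into every ensemble member."""
        x = np.asarray(x, dtype=float).reshape(1, -1)
        y = np.asarray([y])
        for member in self.members:
            # Gamma(1, 1) weight: the online equivalent of the batch
            # Bayesian bootstrap's Dirichlet weights, up to normalization.
            w = self.rng.gamma(shape=1.0, scale=1.0)
            member.partial_fit(x, y, classes=self.classes, sample_weight=[w])

    def predict(self, X):
        """Majority vote over members (assumes nonnegative integer labels)."""
        X = np.atleast_2d(X)
        votes = np.stack([m.predict(X) for m in self.members])
        return np.apply_along_axis(
            lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)


# Usage on a small synthetic stream, processed one point at a time.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

model = OnlineBayesianBagging(n_estimators=25)
for xi, yi in zip(X, y):
    model.update(xi, yi)
print("training accuracy:", (model.predict(X) == y).mean())
```

Because each Gamma weight depends only on the arriving point, nothing about the stream's past or future is needed, which is why the online and batch Bayesian versions coincide exactly rather than approximately.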