We study resource-limited online learning, motivated by the problem of conditional-branch outcome prediction in computer architecture. In particular, we consider (parallel) time- and space-efficient ensemble learners for online settings, empirically demonstrating benefits similar to those shown previously for offline ensembles. Our learning algorithms are inspired by the previously published "boosting by filtering" framework as well as the offline Arc-x4 boosting-style algorithm. We train ensembles of online decision trees using a novel variant of the ID4 online decision-tree algorithm as the base learner, and show empirical results for both boosting-style and bagging-style online ensemble methods. We evaluate these methods on both our branch-prediction domain and online variants of three familiar machine-learning benchmarks. Our data justify three key claims. First, we show empirically that our extensions to ID4 significantly improve performance for single trees and, additionally, are critical to achieving performance gains in tree ensembles. Second, our results indicate significant improvements in predictive accuracy with ensemble size for the boosting-style algorithm. The bagging algorithms we tried performed poorly relative to the boosting-style algorithm (though they still improve upon individual base learners). Third, we show that ensembles of small trees often outperform large single trees with the same total number of nodes (and similarly outperform smaller ensembles of larger trees that use the same total number of nodes). This makes online boosting particularly useful in domains such as branch prediction with tight space restrictions (i.e., the available real estate on a microprocessor chip).
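To make the boosting-style scheme concrete, the following is a minimal, hypothetical sketch of an online ensemble with Arc-x4-inspired weighting: for each arriving example, base learner k receives the example with weight 1 + m^4, where m is the number of mistakes made on that example by learners 1 through k-1, and the ensemble predicts by majority vote. The `TableLearner` base learner is a toy counting-table stand-in (not the paper's ID4-variant decision tree), and all class and method names here are illustrative assumptions, not the authors' implementation.

```python
from collections import defaultdict


class TableLearner:
    """Toy online base learner: a per-context weighted vote counter.

    A hypothetical stand-in for the ID4-style online decision tree used
    in the paper; labels are +1 / -1 (e.g., branch taken / not taken).
    """

    def __init__(self):
        # context -> accumulated weighted label sum
        self.counts = defaultdict(float)

    def predict(self, x):
        return 1 if self.counts[x] >= 0 else -1

    def update(self, x, y, weight=1.0):
        self.counts[x] += weight * y


class OnlineArcX4Ensemble:
    """Online boosting-style ensemble with Arc-x4-inspired weighting.

    Each example is shown to every base learner in sequence; learner k
    trains on it with weight 1 + m**4, where m counts the mistakes made
    on this example by the earlier learners (their pre-update predictions).
    """

    def __init__(self, n_learners, make_base=TableLearner):
        self.learners = [make_base() for _ in range(n_learners)]

    def predict(self, x):
        votes = sum(h.predict(x) for h in self.learners)
        return 1 if votes >= 0 else -1

    def learn(self, x, y):
        mistakes = 0
        for h in self.learners:
            weight = 1.0 + mistakes ** 4  # Arc-x4-style example weight
            if h.predict(x) != y:         # mistake before this update
                mistakes += 1
            h.update(x, y, weight=weight)
```

In a branch-prediction setting, `x` might be a history of recent branch outcomes and `y` the next outcome; later learners concentrate (via the rapidly growing `1 + m**4` weight) on examples the earlier learners misclassify, mirroring the offline Arc-x4 emphasis on hard examples.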