As data and information continue to grow, the ability to learn incrementally is becoming increasingly important for machine learning approaches. Unlike classic batch learning algorithms, which synthesize all available information at once, online algorithms process examples as they arrive and try to avoid retaining irrelevant information. Combining classifiers has been proposed as a route to improved classification accuracy; however, most ensemble algorithms operate in batch mode. For this reason, we propose an incremental ensemble that combines, by voting, five classifiers that can operate incrementally: Naive Bayes, Averaged One-Dependence Estimators (AODE), 3-Nearest Neighbors, Non-Nested Generalised Exemplars (NNGE), and KStar. We performed a large-scale comparison of the proposed ensemble with other state-of-the-art algorithms on several datasets, and the proposed method produces better accuracy in most cases.
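To make the voting scheme concrete, the following is a minimal Python sketch of a majority-vote incremental ensemble, not the authors' actual implementation. It assumes each base learner exposes a scikit-learn-style partial_fit/predict interface; the IncrementalVotingEnsemble class is a hypothetical name, and since AODE, NNGE and KStar have no scikit-learn implementations, GaussianNB and SGDClassifier stand in purely to demonstrate the voting mechanics.

from collections import Counter

import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.naive_bayes import GaussianNB


class IncrementalVotingEnsemble:
    """Unweighted majority voting over incremental base classifiers."""

    def __init__(self, learners):
        # Each learner must implement partial_fit(X, y, classes) and predict(X).
        self.learners = learners

    def partial_fit(self, X, y, classes=None):
        # Forward each incoming batch of examples to every base learner.
        for learner in self.learners:
            learner.partial_fit(X, y, classes=classes)
        return self

    def predict(self, X):
        # One vote per learner per example; the majority label wins
        # (ties break toward the first-listed learner).
        votes = [learner.predict(X) for learner in self.learners]
        return np.array([Counter(col).most_common(1)[0][0] for col in zip(*votes)])


# Illustrative stand-ins for the paper's five incremental learners.
ensemble = IncrementalVotingEnsemble([GaussianNB(), SGDClassifier()])

rng = np.random.default_rng(0)
classes = np.array([0, 1])
for _ in range(10):  # simulate a data stream arriving in small batches
    X = rng.normal(size=(32, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)
    ensemble.partial_fit(X, y, classes=classes)

print(ensemble.predict(rng.normal(size=(5, 4))))

Note that with an odd number of base learners, as in the paper's five-member ensemble, plain majority voting on a two-class problem cannot tie; the two-learner stand-in above is for illustration only.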