Multi-information ensemble diversity
MCS'10 Proceedings of the 9th international conference on Multiple Classifier Systems
This paper examines the benefits that information theory can bring to the study of multiple classifier systems. We discuss the relationship between the mutual information and the classification error of a predictor, and show how this applies to ensemble systems via a natural expansion of the ensemble mutual information into "accuracy" and "diversity" components. This derivation of a diversity term is an alternative to previous attempts to define such a term artificially. The main finding is that diversity in fact exists at multiple orders of correlation, and that pairwise diversity can capture only the low-order components.
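As a minimal numerical sketch of the last point (not the paper's own decomposition), the toy data below is a hypothetical three-member ensemble whose joint output is a parity function of the label: every single member, and even every pair of members, carries zero mutual information about the target, yet the full ensemble determines it completely. Any purely pairwise diversity measure would therefore miss the structure entirely. All names and the data are illustrative assumptions.

```python
# Sketch: empirical mutual information showing higher-order ensemble structure
# that pairwise statistics cannot capture. Toy data, not the paper's method.
from collections import Counter
from math import log2


def entropy(samples):
    """Empirical Shannon entropy (bits) of a list of hashable outcomes."""
    n = len(samples)
    return -sum((c / n) * log2(c / n) for c in Counter(samples).values())


def mutual_info(xs, ys):
    """Empirical I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))


# Hypothetical base-classifier outputs: all 8 joint configurations, with the
# label y equal to the parity (XOR) of the three outputs.
h1 = [0, 0, 0, 0, 1, 1, 1, 1]
h2 = [0, 0, 1, 1, 0, 0, 1, 1]
h3 = [0, 1, 0, 1, 0, 1, 0, 1]
y  = [0, 1, 1, 0, 1, 0, 0, 1]

# Full ensemble, viewed as a single joint variable, determines y exactly.
ensemble_mi = mutual_info(list(zip(h1, h2, h3)), y)          # 1 bit

# Each individual member is statistically independent of y.
individual_mi = sum(mutual_info(h, y) for h in (h1, h2, h3))  # 0 bits

# Even every pair of members is independent of y: the information lives
# only at the third order of correlation.
pair_mi = mutual_info(list(zip(h1, h2)), y)                   # 0 bits

print(ensemble_mi, individual_mi, pair_mi)  # → 1.0 0.0 0.0
```

The gap between the ensemble information and any sum of lower-order terms is exactly the kind of higher-order interaction the abstract refers to: a diversity notion built from pairwise quantities alone cannot see it.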