Information theoretic combination of classifiers with application to AdaBoost
MCS'07 Proceedings of the 7th international conference on Multiple classifier systems
Combining several classifiers has proved to be an effective machine learning technique. Two concepts clearly influence the performance of an ensemble of classifiers: the diversity between classifiers and the individual accuracies of the classifiers. In this paper we propose an information theoretic framework to establish a link between these quantities. Since the two objectives tend to conflict, we propose an information theoretic score (ITS) that expresses a trade-off between individual accuracy and diversity. This technique can be used directly, for example, to select an optimal ensemble from a pool of classifiers. We perform experiments in the context of overproduction and selection of classifiers, showing that selection based on the ITS outperforms state-of-the-art diversity-based selection techniques.
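The overproduce-and-select setting described above can be sketched as follows. This is a minimal illustration, not the paper's actual ITS: the score below is a hypothetical linear trade-off between mean individual accuracy and mean pairwise disagreement (the `lam` weight and the disagreement measure are assumptions; the paper's score is built from information theoretic quantities not reproduced here).

```python
import itertools

def accuracy(preds, labels):
    """Fraction of samples a single classifier predicts correctly."""
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def disagreement(p1, p2):
    """Pairwise diversity: fraction of samples on which two classifiers differ."""
    return sum(a != b for a, b in zip(p1, p2)) / len(p1)

def its_score(ensemble, labels, lam=0.5):
    """Hypothetical trade-off score: (1-lam)*mean accuracy + lam*mean diversity."""
    accs = [accuracy(p, labels) for p in ensemble]
    pairs = list(itertools.combinations(ensemble, 2))
    div = sum(disagreement(a, b) for a, b in pairs) / len(pairs) if pairs else 0.0
    return (1 - lam) * (sum(accs) / len(accs)) + lam * div

def select_ensemble(pool, labels, k, lam=0.5):
    """Overproduce-and-select: pick the size-k subset of the pool maximizing the score."""
    best = max(itertools.combinations(pool, k),
               key=lambda e: its_score(list(e), labels, lam))
    return list(best)
```

For example, with a pool of three prediction vectors on four validation samples, `select_ensemble(pool, labels, 2)` returns the pair whose combined accuracy/diversity score is highest; raising `lam` makes the selection favor classifiers that disagree more, even at some cost in individual accuracy.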