Computer-based probabilistic-network construction
Machine Learning - Special issue on learning with probabilistic representations
Because there is no clear guideline or technique for selecting classifiers that maximise both diversity and accuracy, the development of techniques for analysing classifier relationships, and of methods for generating good constituent classifiers, remains an important research direction. In this paper we propose a framework based on the Bayesian Belief Network (BBN) approach to classification. In the proposed approach the multiple-classifier system is modelled at a meta-level, and the relationships between individual classifiers are abstracted using Bayesian structural learning methods. We show that the relationships revealed by the learned BBN structures are supported by standard correlation and diversity measures, and we use the dependency properties of the learned Bayesian structure to illustrate that BBNs can be used both to explore classifier relationships and to select classifiers.
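As a minimal sketch of the "standard correlation and diversity measures" the abstract refers to (not code from the paper itself), the snippet below computes two common pairwise measures from the multiple-classifier-systems literature, the disagreement measure and Yule's Q statistic, given each classifier's per-sample correctness vector. The function name and inputs are illustrative assumptions.

```python
def pairwise_diversity(correct_i, correct_j):
    """Return (disagreement, yule_q) for two classifiers.

    correct_i, correct_j: sequences of 0/1 flags indicating whether
    each classifier labelled the corresponding sample correctly.
    """
    n11 = n10 = n01 = n00 = 0
    for a, b in zip(correct_i, correct_j):
        if a and b:
            n11 += 1        # both correct
        elif a and not b:
            n10 += 1        # only classifier i correct
        elif b and not a:
            n01 += 1        # only classifier j correct
        else:
            n00 += 1        # both wrong
    n = n11 + n10 + n01 + n00
    # Disagreement: fraction of samples on which exactly one is correct.
    disagreement = (n10 + n01) / n
    # Yule's Q: +1 for fully dependent, 0 for independent classifiers.
    denom = n11 * n00 + n10 * n01
    yule_q = (n11 * n00 - n10 * n01) / denom if denom else 0.0
    return disagreement, yule_q
```

A learned BBN structure over such classifier outputs should echo these measures: classifier pairs linked by an edge tend to show high |Q| and low disagreement, while unlinked (conditionally independent) pairs are the diverse ones worth combining.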