Classifiers based on Gaussian mixture models perform well in many pattern recognition tasks. Unlike decision trees, they are stable classifiers: a small change in the sampling of the training set produces only a small change in the parameters of the trained classifier. Given that ensembling techniques often rely on the instability of the base classifiers to produce diverse ensembles, and thereby achieve better performance than individual classifiers, how can we form ensembles of Gaussian mixture models? This paper proposes methods to optimise coverage in ensembles of Gaussian mixture classifiers by promoting diversity amongst these stable base classifiers. We show that changes in the signal processing chain and in the modelling parameters can lead to significant complementarity between classifiers, even when they are trained on the same source signal. We illustrate the approach on a signature verification problem and obtain very good results, as confirmed in the large-scale international evaluation campaign BMEC 2007.
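A minimal sketch of the idea, not the paper's exact pipeline: ensemble members are diversified by varying the modelling parameters (number of mixture components) and the feature subset (a stand-in for changes in the signal processing chain), then fused by averaging per-member log-likelihood ratios. The data, parameter values, and fusion rule below are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Synthetic two-class data (e.g. genuine vs. forged signature features).
X_pos = rng.normal(loc=1.0, scale=1.0, size=(200, 6))
X_neg = rng.normal(loc=-1.0, scale=1.0, size=(200, 6))

def train_member(n_components, feat_idx, seed):
    """Train one GMM pair (client model, impostor model) on a feature subset."""
    gmm_pos = GaussianMixture(n_components=n_components, random_state=seed)
    gmm_neg = GaussianMixture(n_components=n_components, random_state=seed)
    gmm_pos.fit(X_pos[:, feat_idx])
    gmm_neg.fit(X_neg[:, feat_idx])
    return gmm_pos, gmm_neg, feat_idx

# Diversity comes from different component counts and feature subsets,
# even though every member sees the same source signal.
ensemble = [
    train_member(k, idx, seed=s)
    for s, (k, idx) in enumerate([(2, [0, 1, 2]), (4, [2, 3, 4]), (8, [1, 4, 5])])
]

def score(x):
    """Fuse members by averaging their log-likelihood ratios."""
    x = np.atleast_2d(x)
    llrs = [
        gp.score_samples(x[:, idx]) - gn.score_samples(x[:, idx])
        for gp, gn, idx in ensemble
    ]
    return np.mean(llrs, axis=0)

# Genuine samples should receive higher fused scores than impostor ones.
print(score(X_pos).mean() > score(X_neg).mean())
```

Averaging log-likelihood ratios is one simple fusion rule; a trained combiner or weighted sum could replace it without changing the ensemble construction.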