We present a method for training support vector machine (SVM)-based classification systems for combination with other classification systems designed for the same task. Ideally, a new system should be designed such that, when combined with existing systems, the resulting performance is optimized. We present a simple model for this problem and use the understanding gained from this analysis to propose a method for achieving better combination performance when training SVM systems. We include a regularization term in the SVM objective function that aims to reduce the average class-conditional covariance between the resulting scores and the scores produced by the existing systems, introducing a trade-off between such covariance and the system's individual performance. That is, the new system "takes one for the team", falling somewhat short of its best possible individual performance in order to increase the diversity of the ensemble. We report results on the NIST 2005 and 2006 speaker recognition evaluations (SREs) for a variety of subsystems. We show a 19% relative gain in equal error rate (EER) for a combination of four systems when applying the proposed method, relative to the performance obtained when the four systems are trained independently.
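The idea of penalizing class-conditional covariance with an existing system's scores can be sketched as follows. This is an illustrative subgradient-descent implementation of a linear SVM with an added penalty on the squared average class-conditional covariance; the function names, the choice of a squared-covariance penalty, and the training hyperparameters are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def class_conditional_cov(scores, other, y):
    """Covariance between two score sets, computed within each class
    and averaged over classes (labels assumed in {-1, +1})."""
    covs = []
    for cls in np.unique(y):
        s = scores[y == cls]
        o = other[y == cls]
        covs.append(np.mean((s - s.mean()) * (o - o.mean())))
    return float(np.mean(covs))

def train_diverse_svm(X, y, other_scores, C=1.0, lam=1.0, lr=0.05, epochs=300):
    """Train a linear SVM by subgradient descent on
        0.5*||w||^2 + (C/n)*sum(hinge) + lam*cov^2,
    where cov is the average class-conditional covariance between this
    system's scores and an existing system's scores.
    Illustrative sketch only; the paper's objective may differ in detail."""
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    classes = np.unique(y)
    for _ in range(epochs):
        scores = X @ w + b
        # Hinge-loss subgradient over margin-violating examples.
        active = y * scores < 1
        grad_w = w - C * (X[active].T @ y[active]) / n
        grad_b = -C * y[active].sum() / n
        # Gradient of lam*cov^2; the bias b cancels in the centered covariance.
        cov = class_conditional_cov(scores, other_scores, y)
        for cls in classes:
            idx = y == cls
            o = other_scores[idx] - other_scores[idx].mean()
            Xc = X[idx] - X[idx].mean(axis=0)
            # d(cov_cls)/dw = Xc^T o / n_cls, averaged over classes.
            grad_w += 2 * lam * cov * (Xc.T @ o) / idx.sum() / len(classes)
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b
```

On synthetic data, raising `lam` trades a small amount of individual accuracy for lower covariance with the existing system's scores, which is precisely the "takes one for the team" trade-off described above.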