Achieving good performance in biometrics requires matching the capacity of a classifier, or of a set of classifiers, to the size of the available training set. A classifier with too many adjustable parameters (large capacity) will likely learn the training set without difficulty but fail to generalize properly to new patterns. If the capacity is too small, the training set may not be learned without appreciable error. There is thus an advantage in controlling capacity through a variety of methods, involving not only the structure of the classifiers themselves but also the properties of the input space. This paper proposes an original nonparametric method for optimally combining the responses of multiple classifiers. Highly favorable results have been obtained with this method.
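The abstract does not detail the proposed combination method, but the general idea of fusing multiple classifier responses can be illustrated with a simple plurality vote. The sketch below is a generic stand-in, not the paper's nonparametric combiner; the classifier labels and tie-breaking rule are assumptions for illustration only.

```python
# Illustrative sketch: combining multiple classifier responses by
# plurality vote. This is a generic stand-in for classifier fusion,
# NOT the nonparametric method proposed in the paper (whose details
# are not given in this abstract).

from collections import Counter

def combine_votes(predictions):
    """Return the class label predicted by the most classifiers.

    predictions: list of labels, one per classifier.
    Ties are broken by first appearance among the inputs
    (Counter preserves insertion order in Python 3.7+).
    """
    counts = Counter(predictions)
    return counts.most_common(1)[0][0]

# Three hypothetical classifiers vote on one input pattern.
fused = combine_votes(["genuine", "impostor", "genuine"])
print(fused)  # -> genuine
```

More refined combiners weight each classifier by an estimate of its reliability, or model the joint distribution of classifier outputs nonparametrically; the voting rule above is only the simplest member of that family.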