A theory of multiple classifier systems and its application to visual word recognition
Face Recognition by Elastic Bunch Graph Matching
IEEE Transactions on Pattern Analysis and Machine Intelligence
The FERET Evaluation Methodology for Face-Recognition Algorithms
IEEE Transactions on Pattern Analysis and Machine Intelligence
Rank aggregation methods for the Web
Proceedings of the 10th international conference on World Wide Web
Introduction to Modern Information Retrieval
Proceedings of the First International Workshop on Multiple Classifier Systems
An introduction to boosting and leveraging
Advanced lectures on machine learning
Web metasearch: rank vs. score based rank aggregation methods
Proceedings of the 2003 ACM symposium on Applied computing
An efficient boosting algorithm for combining preferences
The Journal of Machine Learning Research
Mixed group ranks: preference and confidence in classifier combination
IEEE Transactions on Pattern Analysis and Machine Intelligence
RankBoost has been shown to be an effective algorithm for combining ranks. However, its ability to generalize well without overfitting depends directly on the choice of weak learner: regularization of the combined rank function comes from the regularization properties of its weak learners. We present a regularization property, consistency in preference and confidence, that translates mathematically into monotonic concavity, and we describe a new weak ranking learner (MWGR) that produces ranking functions with this property. In experiments combining the ranks of multiple face recognition algorithms, and in an experiment combining text information retrieval systems, rank functions built with MWGR outperformed those built with binary weak learners.
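The combination scheme the abstract describes can be sketched in miniature: each base classifier contributes a ranking, each ranking is mapped through a monotone weak scoring function, and the weighted sum of those scores orders the candidates. This is a generic, hypothetical illustration of rank aggregation in the RankBoost style, not the authors' MWGR learner; the function names, the linear weak score, and the fixed weights are all assumptions for the sketch (in RankBoost the weights would be learned by boosting).

```python
def combine_ranks(rank_lists, weights):
    """Aggregate several rankings of the same candidate set into one.

    rank_lists: list of dicts mapping candidate -> rank (1 = best).
    weights: one weight per base ranker (here fixed; a boosting
             procedure such as RankBoost would learn these).
    Returns the candidates sorted best-first by combined score.
    """
    scores = {}
    for ranks, w in zip(rank_lists, weights):
        n = len(ranks)
        for cand, r in ranks.items():
            # Weak score: monotone decreasing in the rank, scaled to (0, 1].
            scores[cand] = scores.get(cand, 0.0) + w * (n - r + 1) / n
    return sorted(scores, key=scores.get, reverse=True)


# Two hypothetical face recognizers ranking three candidates:
ranker_a = {"alice": 1, "bob": 2, "carol": 3}
ranker_b = {"bob": 1, "alice": 2, "carol": 3}
fused = combine_ranks([ranker_a, ranker_b], weights=[2.0, 1.0])
```

With these weights the first ranker dominates, so "alice" comes out on top and "carol", ranked last by both, stays last. The paper's contribution lies in the shape of the weak scoring function (monotonically concave rather than linear or binary) and in learning the weights; this sketch only shows the aggregation skeleton those choices plug into.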