In this paper, we propose to learn, for every local feature, its similar local features across all positive images, rather than relying on a heuristic distance as the similarity measure. Specifically, Multiple Instance Learning (MIL) is employed to simultaneously determine the points similar to a given local feature and to learn a corresponding discriminative function, which can itself be regarded as a learned similarity measure. A weak learner is then constructed for each local feature from this similarity measure, and AdaBoost selects the most discriminative local features and combines them into a strong classifier. Experimental results show encouraging performance of our method.
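The boosting step described above can be sketched as follows. This is an illustrative sketch only, not the paper's implementation: the MIL stage that learns each feature's similarity-based weak learner is abstracted away as a pool of precomputed weak-learner outputs (one row per candidate local feature), and all function names are hypothetical.

```python
import numpy as np

def adaboost_train(weak_preds, y, n_rounds):
    """Discrete AdaBoost over a pool of precomputed weak-learner outputs.

    weak_preds: (n_weak, n_samples) array of +/-1 predictions, one row
                per candidate weak learner (e.g. one per local feature).
    y:          (n_samples,) array of +/-1 labels.
    Returns a list of (weak_index, alpha) pairs forming the strong classifier.
    """
    n_weak, n_samples = weak_preds.shape
    w = np.full(n_samples, 1.0 / n_samples)  # uniform initial sample weights
    ensemble = []
    for _ in range(n_rounds):
        # weighted error of every candidate weak learner
        errs = (weak_preds != y) @ w
        best = int(np.argmin(errs))
        eps = max(float(errs[best]), 1e-10)
        if eps >= 0.5:                # no weak learner better than chance
            break
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        ensemble.append((best, alpha))
        # re-weight: increase weight on misclassified samples
        w *= np.exp(-alpha * y * weak_preds[best])
        w /= w.sum()
    return ensemble

def adaboost_predict(ensemble, weak_preds):
    """Sign of the alpha-weighted vote of the selected weak learners."""
    score = sum(alpha * weak_preds[i] for i, alpha in ensemble)
    return np.sign(score)
```

In this sketch, selecting `best` at each round plays the role of AdaBoost picking the most discriminative local feature, and the returned `(index, alpha)` pairs define the strong classifier as a weighted vote.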