International Journal of Computer Vision
Discriminant Adaptive Nearest Neighbor Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Improved Boosting Algorithms Using Confidence-rated Predictions. Machine Learning, The Eleventh Annual Conference on Computational Learning Theory.
An improved bound on the finite-sample risk of the nearest neighbor rule. Pattern Recognition Letters.
Advances in Instance Selection for Instance-Based Learning Algorithms. Data Mining and Knowledge Discovery.
Modeling the Shape of the Scene: A Holistic Representation of the Spatial Envelope. International Journal of Computer Vision.
Kernel Nearest-Neighbor Algorithm. Neural Processing Letters.
Similarity Search in High Dimensions via Hashing. VLDB '99: Proceedings of the 25th International Conference on Very Large Data Bases.
Context-based vision system for place and object recognition. ICCV '03: Proceedings of the Ninth IEEE International Conference on Computer Vision, Volume 2.
Video Google: A Text Retrieval Approach to Object Matching in Videos. ICCV '03: Proceedings of the Ninth IEEE International Conference on Computer Vision, Volume 2.
Distinctive Image Features from Scale-Invariant Keypoints. International Journal of Computer Vision.
A Bayesian Hierarchical Model for Learning Natural Scene Categories. CVPR '05: Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 2.
The Pyramid Match Kernel: Discriminative Classification with Sets of Image Features. ICCV '05: Proceedings of the Tenth IEEE International Conference on Computer Vision, Volume 2.
Boosting the distance estimation. Pattern Recognition Letters.
Learning Weighted Metrics to Minimize Nearest-Neighbor Classification Error. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Beyond Bags of Features: Spatial Pyramid Matching for Recognizing Natural Scene Categories. CVPR '06: Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 2.
SVM-KNN: Discriminative Nearest Neighbor Classification for Visual Category Recognition. CVPR '06: Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Volume 2.
Nearest-Neighbor Methods in Learning and Vision: Theory and Practice (Neural Information Processing).
Semantic Modeling of Natural Scenes for Content-Based Image Retrieval. International Journal of Computer Vision.
Boosted discriminant projections for nearest neighbor classification. Pattern Recognition.
ML-KNN: A lazy learning approach to multi-label learning. Pattern Recognition.
Indoor versus outdoor scene classification using probabilistic neural network. EURASIP Journal on Applied Signal Processing.
The Journal of Machine Learning Research.
BoostMap: An Embedding Method for Efficient Nearest Neighbor Retrieval. IEEE Transactions on Pattern Analysis and Machine Intelligence.
On kernel difference-weighted k-nearest neighbor classification. Pattern Analysis & Applications, Special Issue: Non-parametric Distance-based Classification Techniques and Their Applications.
Boosting k-nearest neighbor classifier by means of input space projection. Expert Systems with Applications: An International Journal.
Information Theory in Computer Vision and Pattern Recognition.
Bregman Divergences and Surrogates for Learning. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Indoor vs. outdoor scene classification in digital photographs. Pattern Recognition.
Classification Methods with Reject Option Based on Convex Risk Minimization. The Journal of Machine Learning Research.
Product Quantization for Nearest Neighbor Search. IEEE Transactions on Pattern Analysis and Machine Intelligence.
An Optimal Global Nearest Neighbor Metric. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Leveraging k-NN for generic classification boosting. Neurocomputing.
The condensed nearest neighbor rule (Corresp.). IEEE Transactions on Information Theory.
Boosting nearest neighbors for the efficient estimation of posteriors. ECML PKDD '12: Proceedings of the 2012 European Conference on Machine Learning and Knowledge Discovery in Databases, Part I.
The k-nearest neighbors (k-NN) classification rule has proven extremely successful in countless computer vision applications. For example, image categorization often relies on uniform voting among the nearest prototypes in the space of descriptors. In spite of its good generalization properties and its natural extension to multi-class problems, the classic k-NN rule suffers from high variance when dealing with sparse prototype datasets in high dimensions. A few techniques have been proposed to improve k-NN classification, relying either on deforming the nearest-neighborhood relationship by learning a distance function, or on modifying the input space by means of subspace selection. From the computational standpoint, many methods have been proposed for speeding up nearest neighbor retrieval, both for multidimensional vector spaces and for non-vector spaces induced by computationally expensive distance measures.

In this paper, we propose a novel boosting approach for generalizing the k-NN rule, by providing a new k-NN boosting algorithm, called UNN (Universal Nearest Neighbors), for the induction of leveraged k-NN. We emphasize that UNN is a formal boosting algorithm in the original boosting terminology. Our approach consists in redefining the voting rule as a strong classifier that linearly combines predictions from the k closest prototypes. The k nearest neighbor examples thus act as weak classifiers, and their weights, called leveraging coefficients, are learned by UNN so as to minimize a surrogate risk that upper-bounds the empirical misclassification rate over the training data. These leveraging coefficients allow us to distinguish the most relevant prototypes for a given class.
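The leveraged voting rule described above can be written compactly. The following is an illustrative sketch with assumed notation (the symbols are not taken from the paper): let α_j denote the leveraging coefficient of prototype x_j and y_{jc} ∈ {−1, +1} its membership in class c.

```latex
% Sketch of the leveraged k-NN vote for class c (illustrative notation):
% NN_k(q) is the set of the k prototypes nearest to query q.
h_c(q) = \sum_{j \,:\, x_j \in \mathrm{NN}_k(q)} \alpha_j \, y_{jc},
\qquad
\hat{c}(q) = \operatorname*{arg\,max}_{c} \; h_c(q)
```

The classic uniform k-NN vote is recovered when every coefficient equals one; learning the coefficients is what turns the vote into a strong classifier.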
Indeed, UNN does not alter the k-nearest-neighborhood relationship itself, but rather acts on top of k-NN search. We carried out experiments comparing UNN to k-NN, support vector machines (SVM), and AdaBoost on the categorization of natural scenes, using state-of-the-art image descriptors (Gist and Bag-of-Features) on real images from Oliva and Torralba (Int. J. Comput. Vis. 42(3):145–175, 2001), Fei-Fei and Perona (IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), pp. 524–531, 2005), and Xiao et al. (IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3485–3492, 2010). The results show that UNN competes with or beats the other contenders, while achieving comparatively small training and testing times.
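To illustrate how leveraging coefficients can be learned by minimizing an exponential surrogate risk, here is a minimal self-contained sketch for binary labels. It is not the authors' UNN implementation; every name and parameter below (`fit_leveraged_knn`, `rounds`, etc.) is invented for illustration. Each prototype acts as a weak classifier that votes on the training examples whose k-neighborhoods contain it and abstains elsewhere; its coefficient receives a confidence-rated, AdaBoost-style update.

```python
import math

def knn_indices(X, q, k):
    """Indices of the k prototypes in X nearest to query q (brute force)."""
    order = sorted(range(len(X)),
                   key=lambda j: sum((a - b) ** 2 for a, b in zip(X[j], q)))
    return order[:k]

def fit_leveraged_knn(X, y, k=3, rounds=2, eps=1e-9):
    """Learn one leveraging coefficient per prototype by minimizing an
    exponential surrogate loss.  Labels y must be in {-1, +1}."""
    n = len(X)
    # leave-one-out k-neighborhood of each training example
    nn = [[j for j in knn_indices(X, X[i], k + 1) if j != i][:k] for i in range(n)]
    # reciprocal neighborhood: the examples that vote through prototype j
    recip = [[i for i in range(n) if j in nn[i]] for j in range(n)]
    alpha = [0.0] * n
    w = [1.0 / n] * n                      # boosting weights on the examples
    for _ in range(rounds):
        for j in range(n):                 # each prototype is a weak classifier
            if not recip[j]:
                continue
            w_plus = sum(w[i] for i in recip[j] if y[i] == y[j])
            w_minus = sum(w[i] for i in recip[j] if y[i] != y[j])
            a = 0.5 * math.log((w_plus + eps) / (w_minus + eps))
            alpha[j] += a
            for i in recip[j]:             # reweight only the affected examples
                w[i] *= math.exp(-a * y[i] * y[j])
            s = sum(w)
            w = [wi / s for wi in w]
    return alpha

def predict(X, y, alpha, q, k=3):
    """Leveraged vote: sign of the alpha-weighted labels of q's k nearest prototypes."""
    score = sum(alpha[j] * y[j] for j in knn_indices(X, q, k))
    return 1 if score >= 0 else -1
```

For instance, on two well-separated clusters labeled -1 and +1 (e.g. four points around the origin and four around (5, 5)), fitting and then querying a point near the first cluster returns -1; prototypes that keep disagreeing with their reciprocal neighborhood end up with small or negative coefficients, which is how the learned weights single out the relevant prototypes.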