Boosting k-NN for Categorization of Natural Scenes

  • Authors:
  • Richard Nock; Paolo Piro; Frank Nielsen; Wafa Bel Haj Ali; Michel Barlaud

  • Affiliations:
  • CEREGMIA, Université Antilles-Guyane, Martinique, France; Istituto Italiano di Tecnologia, Genoa 16163, Italy; Sony Computer Science Laboratories, Inc., Tokyo, Japan and LIX Department, École Polytechnique, Palaiseau, France; University of Nice-Sophia Antipolis/CNRS, Sophia Antipolis 06903, France; University of Nice-Sophia Antipolis/CNRS, Sophia Antipolis 06903, France

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 2012

Abstract

The k-nearest neighbors (k-NN) classification rule has proven extremely successful in countless computer vision applications. For example, image categorization often relies on uniform voting among the nearest prototypes in the space of descriptors. In spite of its good generalization properties and its natural extension to multi-class problems, the classic k-NN rule suffers from high variance when dealing with sparse prototype datasets in high dimensions. A few techniques have been proposed to improve k-NN classification, relying either on deforming the nearest neighborhood relationship by learning a distance function or on modifying the input space by means of subspace selection. From the computational standpoint, many methods have been proposed for speeding up nearest neighbor retrieval, both for multidimensional vector spaces and for nonvector spaces induced by computationally expensive distance measures.

In this paper, we propose a novel boosting approach for generalizing the k-NN rule, providing a new k-NN boosting algorithm, called UNN (Universal Nearest Neighbors), for the induction of leveraged k-NN. We emphasize that UNN is a formal boosting algorithm in the original boosting terminology. Our approach redefines the voting rule as a strong classifier that linearly combines predictions from the k closest prototypes. The k nearest neighbor examples thus act as weak classifiers, and their weights, called leveraging coefficients, are learned by UNN so as to minimize a surrogate risk that upper bounds the empirical misclassification rate over the training data. These leveraging coefficients allow us to distinguish the most relevant prototypes for a given class. Notably, UNN does not affect the k-nearest neighborhood relationship, but rather acts on top of k-NN search.

We carried out experiments comparing UNN to k-NN, support vector machines (SVM) and AdaBoost on the categorization of natural scenes, using state-of-the-art image descriptors (Gist and Bag-of-Features) on real images from Oliva and Torralba (Int. J. Comput. Vis. 42(3):145–175, 2001), Fei-Fei and Perona (IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), pp. 524–531, 2005), and Xiao et al. (IEEE Conference on Computer Vision and Pattern Recognition (CVPR), pp. 3485–3492, 2010). Results show that UNN competes with or beats the other contenders, while achieving comparatively small training and testing times.
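
To make the leveraged voting rule concrete, the following is a minimal Python sketch of the prediction step in the spirit of the abstract, not the authors' reference implementation. It assumes a one-vs-all class encoding (labels in {-1, +1} per class), plain Euclidean distances for the neighborhood search, and leveraging coefficients (here `alphas`) that have already been learned by minimizing a surrogate risk; all names and the encoding are illustrative assumptions.

    import numpy as np

    def leveraged_knn_predict(x, prototypes, labels, alphas, k=10):
        """Leveraged k-NN voting (sketch).

        x          : (d,) query descriptor
        prototypes : (m, d) training descriptors
        labels     : (m, C) with labels[j, c] = +1 if prototype j belongs
                     to class c, -1 otherwise (one-vs-all encoding; assumed)
        alphas     : (m, C) leveraging coefficients, assumed already learned
        """
        # Plain k-NN search: UNN leaves the neighborhood relationship
        # itself untouched and only reweights the neighbors' votes.
        dists = np.linalg.norm(prototypes - x, axis=1)
        nn = np.argsort(dists)[:k]

        # Strong classifier: linear combination of the k neighbors'
        # class memberships, weighted by their leveraging coefficients.
        scores = (alphas[nn] * labels[nn]).sum(axis=0)
        return int(np.argmax(scores))

    # Toy usage: 5 prototypes in 2-D, 3 classes, random coefficients.
    rng = np.random.default_rng(0)
    prototypes = rng.normal(size=(5, 2))
    labels = -np.ones((5, 3))
    labels[np.arange(5), rng.integers(0, 3, size=5)] = 1.0
    alphas = rng.uniform(size=(5, 3))
    print(leveraged_knn_predict(np.zeros(2), prototypes, labels, alphas, k=3))

Setting all coefficients to 1 recovers uniform k-NN voting, which illustrates the sense in which UNN generalizes the classic rule while acting on top of an unchanged k-NN search.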