Naive Bayes image classification: beyond nearest neighbors

  • Authors:
  • Radu Timofte; Tinne Tuytelaars; Luc Van Gool

  • Affiliations:
  • Radu Timofte, Tinne Tuytelaars: ESAT-VISICS/IBBT, Catholic University of Leuven, Belgium
  • Luc Van Gool: ESAT-VISICS/IBBT, Catholic University of Leuven, Belgium and D-ITET, ETH Zurich, Switzerland

  • Venue:
  • ACCV'12: Proceedings of the 11th Asian Conference on Computer Vision, Part I
  • Year:
  • 2012

Abstract

Naive Bayes Nearest Neighbor (NBNN) has been proposed as a powerful, learning-free, non-parametric approach for object classification. Its good performance is mainly due to avoiding a vector quantization step and to using image-to-class comparisons, which yield good generalization. In this paper we study replacing the nearest-neighbor part with more elaborate and robust (sparse) representations, as well as trading performance for speed for practical purposes. The representations investigated are k-Nearest Neighbors (kNN), Iterative Nearest Neighbors (INN) solving a constrained least squares (LS) problem, Local Linear Embedding (LLE), a Sparse Representation obtained by l1-regularized LS ($SR_{l_1}$), and a Collaborative Representation obtained as the solution of an l2-regularized LS problem ($CR_{l_2}$). In particular, the NIMBLE and K-DES descriptors proved viable alternatives to SIFT, and the NB$SR_{l_1}$ and NBINN classifiers provide significant improvements over NBNN, obtaining competitive results on the Scene-15, Caltech-101, and PASCAL VOC 2007 datasets, while remaining learning-free approaches (i.e., no parameters need to be learned).
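To make the $CR_{l_2}$ idea concrete, the following is a minimal NumPy sketch, not the paper's implementation: each query descriptor is represented as an l2-regularized least-squares combination of a class's descriptor pool (the ridge closed form $c = (D^\top D + \lambda I)^{-1} D^\top y$), and the class with the smallest image-to-class reconstruction residual wins. The function names, the regularization value, and the toy data are all illustrative assumptions.

```python
import numpy as np

def ridge_coeffs(D, y, lam=0.5):
    """Collaborative (l2-regularized LS) representation of query y over
    dictionary D (columns = class descriptors). Closed-form ridge solution:
    c = (D^T D + lam*I)^{-1} D^T y."""
    k = D.shape[1]
    return np.linalg.solve(D.T @ D + lam * np.eye(k), D.T @ y)

def classify(y, class_dicts, lam=0.5):
    """Assign y to the class whose descriptor pool reconstructs it with the
    smallest residual ||y - D c|| (image-to-class comparison)."""
    best_label, best_res = None, np.inf
    for label, D in class_dicts.items():
        c = ridge_coeffs(D, y, lam)
        res = np.linalg.norm(y - D @ c)
        if res < best_res:
            best_label, best_res = label, res
    return best_label

# Toy example (hypothetical data): two classes with well-separated pools.
rng = np.random.default_rng(0)
D0 = rng.normal(0.0, 0.1, (8, 5))        # class 0: descriptors near 0
D1 = 1.0 + rng.normal(0.0, 0.1, (8, 5))  # class 1: descriptors near 1
query = 1.0 + rng.normal(0.0, 0.1, 8)    # a class-1-like query
print(classify(query, {0: D0, 1: D1}))   # prints 1
```

Unlike the $SR_{l_1}$ variant, which needs an iterative l1 solver, this l2-regularized representation has a closed form, which is the speed-versus-performance trade-off the abstract alludes to.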