MyPlaces: detecting important settings in a visual diary

  • Authors:
  • Michael Blighe; Noel E. O'Connor

  • Affiliations:
  • Dublin City University, Dublin, Ireland; Dublin City University, Dublin, Ireland

  • Venue:
  • CIVR '08: Proceedings of the 2008 International Conference on Content-Based Image and Video Retrieval
  • Year:
  • 2008

Abstract

We describe a novel approach to identifying specific settings in large collections of passively captured images that form a visual diary. A setting detection algorithm should be capable of detecting images captured at the same real-world locations (e.g. in the dining room at home, in front of the computer in the office, in the park, etc.). This requires methods that can identify visually similar backgrounds across images from their visual features. We use a Bag of Keypoints approach, in which multiple image patches are sampled, described using Scale Invariant Feature Transform (SIFT) features, and then vector quantized. We compare three classifiers, K Nearest Neighbour (KNN), Multiclass Linear Perceptron (MLP), and Support Vector Machine (SVM), and present results for classifying ten different settings across one week's worth of images. Our results demonstrate that the method achieves good classification accuracy even without exploiting geometric or context-based information. We also describe an early prototype of a visual diary browser that integrates the classification results.
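
The pipeline outlined in the abstract (keypoint sampling, SIFT description, vector quantization into a visual vocabulary, and classification of the resulting histograms) can be illustrated with a short Python sketch. This is a minimal, generic bag-of-keypoints example, not the authors' implementation: the OpenCV and scikit-learn calls, the vocabulary size, the RBF-kernel SVM, and the file names and labels are all illustrative assumptions.

    import cv2
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    def extract_sift_descriptors(image_paths):
        """Detect keypoints and compute 128-D SIFT descriptors for each image."""
        sift = cv2.SIFT_create()
        per_image = []
        for path in image_paths:
            img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
            _, desc = sift.detectAndCompute(img, None)
            per_image.append(desc if desc is not None else np.empty((0, 128)))
        return per_image

    def build_vocabulary(descriptor_sets, vocab_size=200):
        """Vector-quantize all training descriptors into a visual vocabulary via k-means.

        vocab_size is an illustrative choice; in practice it is tuned on held-out data.
        """
        all_desc = np.vstack(descriptor_sets)
        return KMeans(n_clusters=vocab_size, n_init=4, random_state=0).fit(all_desc)

    def bag_of_keypoints(descriptors, vocab):
        """Represent an image as an L1-normalised histogram of visual-word assignments."""
        hist = np.zeros(vocab.n_clusters)
        if len(descriptors) > 0:
            for word in vocab.predict(descriptors):
                hist[word] += 1
            hist /= hist.sum()
        return hist

    # Hypothetical training data: image paths and their setting labels.
    train_paths = ["img_001.jpg", "img_002.jpg"]
    train_labels = ["office", "kitchen"]

    train_desc = extract_sift_descriptors(train_paths)
    vocab = build_vocabulary(train_desc)
    X_train = np.array([bag_of_keypoints(d, vocab) for d in train_desc])

    # SVM classifier over the bag-of-keypoints histograms; KNN or an MLP
    # could be substituted via the corresponding scikit-learn estimators.
    clf = SVC(kernel="rbf").fit(X_train, train_labels)

In a real setting-detection experiment the vocabulary would be built from many images, and vocabulary size and classifier hyper-parameters would be selected by cross-validation.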