Bayesian Object Localisation in Images

  • Authors:
  • J. Sullivan; A. Blake; M. Isard; J. MacCormick

  • Affiliations:
  • Department of Engineering Science, University of Oxford, Parks Road, Oxford OX1 3PJ, UK. http://www.robots.ox.ac.uk/~vdg/ (all authors)

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 2001

Abstract

A Bayesian approach to intensity-based object localisation is presented that employs a learned probabilistic model of image filter-bank output, applied via Monte Carlo methods, to escape the inefficiency of exhaustive search.

An adequate probabilistic account of image data requires intensities both in the foreground (i.e. over the object) and in the background to be modelled. Some previous approaches to object localisation by Monte Carlo methods have used models which, we claim, do not fully address the issue of the statistical independence of image intensities. It is addressed here by applying to each image a bank of filters whose outputs are approximately statistically independent. Distributions of the responses of individual filters, over foreground and background, are learned from training data. These distributions are then used to define a joint distribution for the output of the filter bank, conditioned on object configuration, and this serves as an observation likelihood for use in probabilistic inference about localisation.

The effectiveness of probabilistic object localisation in image clutter, using Bayesian Localisation, is illustrated. Because it is a Monte Carlo method, it produces not simply a single estimate of object configuration, but an entire sample from the posterior distribution for the configuration. This makes sequential inference of configuration possible. Two examples are illustrated here: coarse-to-fine scale inference, and propagation of configuration estimates over time, in image sequences.
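The sketch below illustrates, in schematic form, the kind of pipeline the abstract describes: learned foreground/background densities over filter responses define an observation likelihood for a hypothesised object configuration, and Monte Carlo (importance) sampling against a prior yields a weighted sample from the posterior over configurations. It is not the authors' code; the filter sites, the Gaussian-shaped response densities, the circular object support, and all names (`log_likelihood`, `posterior_sample`, `object_radius`, etc.) are illustrative assumptions.

```python
# Minimal sketch of filter-bank likelihood evaluation and Monte Carlo
# posterior sampling, under assumptions stated in the text above.
import numpy as np

rng = np.random.default_rng(0)

# --- Learned response distributions (assumed shapes, for illustration) ------
# In the paper these are learned from training data; here we posit two smooth
# 1D densities over filter responses for foreground and background.
bins = np.linspace(-1.0, 1.0, 65)

def make_density(mean, std):
    centres = 0.5 * (bins[:-1] + bins[1:])
    p = np.exp(-0.5 * ((centres - mean) / std) ** 2)
    return p / (p.sum() * np.diff(bins))   # normalise to a histogram density

p_fg = make_density(0.4, 0.2)   # foreground responses assumed larger on average
p_bg = make_density(0.0, 0.3)   # background responses assumed centred near zero

def log_density(responses, density):
    idx = np.clip(np.digitize(responses, bins) - 1, 0, len(density) - 1)
    return np.log(density[idx] + 1e-12)

# --- Observation likelihood for one hypothesised configuration --------------
# `config` is a hypothesised 2D object position; `sites` are filter-support
# locations; `responses` are the (approximately independent) filter outputs.
def log_likelihood(config, sites, responses, object_radius=0.3):
    inside = np.linalg.norm(sites - config, axis=1) < object_radius
    return (log_density(responses[inside], p_fg).sum()
            + log_density(responses[~inside], p_bg).sum())

# --- Importance sampling of the posterior over configurations ---------------
def posterior_sample(sites, responses, n=500):
    prior = rng.uniform(-1.0, 1.0, size=(n, 2))    # broad prior over positions
    logw = np.array([log_likelihood(c, sites, responses) for c in prior])
    w = np.exp(logw - logw.max())                  # stabilised weights
    return prior, w / w.sum()                      # weighted posterior sample

# Tiny synthetic example: an "object" at (0.2, -0.1) producing stronger responses.
true_pos = np.array([0.2, -0.1])
sites = rng.uniform(-1.0, 1.0, size=(400, 2))
inside = np.linalg.norm(sites - true_pos, axis=1) < 0.3
responses = np.where(inside, rng.normal(0.4, 0.2, 400), rng.normal(0.0, 0.3, 400))

samples, weights = posterior_sample(sites, responses)
print("posterior mean position:", (weights[:, None] * samples).sum(axis=0))
```

Because the result is a weighted sample rather than a single estimate, the same machinery supports the sequential uses mentioned above: the sample can be re-weighted at a finer scale, or propagated through a dynamical model to the next frame of a sequence.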