Uncertainty Propagation in Model-Based Recognition

  • Authors:
  • T. D. Alter; David W. Jacobs

  • Affiliations:
  • MIT AI Laboratory, Room 750, 545 Technology Square, Cambridge, MA 02139. E-mail: tda@ai.mit.edu; NEC Research Institute, 4 Independence Way, Princeton, NJ 08540. E-mail: dwj@research.nj.nec.com

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 1998

Abstract

Robust recognition systems require a careful understanding of the effects of error in sensed features. In model-based recognition, matches between model features and sensed image features typically are used to compute a model pose and then project the unmatched model features into the image. The error in the image features results in uncertainty in the projected model features. We first show how error propagates when poses are based on three pairs of 3D model and 2D image points. In particular, we show how to simply and efficiently compute the distributed region in the image where an unmatched model point might appear, for both Gaussian and bounded error in the detection of image points, and for both scaled-orthographic and perspective projection models. Next, we provide geometric and experimental analyses to indicate when this linear approximation will succeed and when it will fail. Then, based on the linear approximation, we show how we can utilize Linear Programming to compute bounded propagated error regions for any number of initial matches. Finally, we use these results to extend, from two-dimensional to three-dimensional objects, robust implementations of alignment, interpretation-tree search, and transformation clustering.
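
As a rough illustration of the propagation idea only (not the paper's derivation), the sketch below works through the simplified planar/affine analogue: three matched points determine the transformation exactly, so the predicted location of an unmatched model point is a fixed affine combination of the matched image points, and independent Gaussian image error propagates to the prediction in closed form. The function names and toy coordinates are hypothetical; the paper's scaled-orthographic and perspective cases for 3D models additionally require the three-point pose computation and the linear approximation discussed in the abstract.

```python
import numpy as np

def affine_coords(basis_2d, point_2d):
    """Affine coordinates (a0, a1, a2), summing to 1, of `point_2d`
    with respect to the three basis points in `basis_2d` (3x2)."""
    # Solve [m1-m0, m2-m0] [a1, a2]^T = point - m0, then a0 = 1 - a1 - a2.
    A = np.column_stack([basis_2d[1] - basis_2d[0], basis_2d[2] - basis_2d[0]])
    a12 = np.linalg.solve(A, point_2d - basis_2d[0])
    return np.array([1.0 - a12.sum(), a12[0], a12[1]])

def propagate_gaussian_error(model_basis, model_query, image_basis, sigma):
    """Predicted image location and 2x2 covariance of an unmatched planar
    model point, given three matched points whose image coordinates carry
    i.i.d. Gaussian error with standard deviation `sigma`."""
    a = affine_coords(model_basis, model_query)
    predicted = a @ image_basis                  # linear in the image points
    covariance = (a @ a) * sigma**2 * np.eye(2)  # exact for the affine case
    return predicted, covariance

# Toy example: three matched model/image point pairs plus one unmatched model point.
model_basis = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
model_query = np.array([0.7, 0.5])
image_basis = np.array([[10.0, 12.0], [35.0, 15.0], [8.0, 40.0]])
pred, cov = propagate_gaussian_error(model_basis, model_query, image_basis, sigma=1.0)
print("predicted location:", pred)
print("covariance:\n", cov)
```

For the 3D cases treated in the paper, the fixed affine coefficients above would be replaced by the Jacobian of the pose-and-project map with respect to the matched image points, so the 2x2 covariance of the prediction becomes a first-order J Sigma J^T approximation rather than the exact planar expression, and bounded error regions can instead be characterized with the Linear Programming formulation the abstract mentions.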