A Perspective Theory for Motion and Shape Estimation in Machine Vision

  • Authors:
  • B. K. Ghosh; E. P. Loucks

  • Venue:
  • SIAM Journal on Control and Optimization
  • Year:
  • 1995

Abstract

In this paper, we consider the problem of estimating the motion and shape of a moving body with the aid of a monocular camera. We show that the estimation problem reduces to a specific parameter estimation problem for a perspective dynamical system. Surprisingly, this reduction is independent of whether the measured data are the brightness pattern the object produces on the image plane or the points, lines, or curves produced on the image plane by discontinuities in the brightness pattern. Many cases of the perspective parameter estimation problem are analyzed in this paper, including a fairly complete analysis of a planar textured surface undergoing a rigid flow and an affine flow. These two cases are analyzed for orthographic, pseudo-orthographic, and image-centered projections. The basic procedure introduced for parameter estimation is to subdivide the problem into two modules, one for "spatial averaging" and the other for "time averaging." The estimation is carried out with the aid of a new "realization theory for perspective systems," introduced for systems described in discrete time and in continuous time. Finally, much of our analysis is substantiated by computer simulation of specific algorithms developed to explicitly compute the parameters. Detailed simulation that would answer the perspective realizability question is a subject of future research.
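To make the notion of a perspective dynamical system concrete, the following is a minimal sketch, not the authors' formulation: it assumes a single 3-D point evolving under a rigid flow dX/dt = ΩX + b (Ω skew-symmetric, b a translation vector) and observed only through a perspective projection y = (X₁/X₃, X₂/X₃). All function names, step sizes, and parameter values here are hypothetical and chosen purely for illustration.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix of w, so that skew(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def simulate(X0, w, b, dt=1e-3, steps=1000):
    """Euler-integrate the rigid flow dX/dt = skew(w) @ X + b and record
    the perspective observations y = (X1/X3, X2/X3) at each step."""
    Omega = skew(np.asarray(w, dtype=float))
    b = np.asarray(b, dtype=float)
    X = np.asarray(X0, dtype=float)
    ys = []
    for _ in range(steps):
        ys.append([X[0] / X[2], X[1] / X[2]])   # only the projection is observed
        X = X + dt * (Omega @ X + b)            # hidden rigid-motion dynamics
    return np.array(ys)

# A point at depth 2 rotating about the y-axis while translating in depth.
ys = simulate(X0=[0.1, -0.2, 2.0], w=[0.0, 0.1, 0.0], b=[0.0, 0.0, 0.05])
```

The parameter estimation problem described in the abstract runs in the opposite direction: given the observed image trajectory `ys` (or a brightness pattern inducing it), recover the motion parameters `w`, `b` and the shape, which is what the "spatial averaging" and "time averaging" modules address.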