Parametric ego-motion estimation for vehicle surround analysis using an omnidirectional camera

  • Authors:
  • Tarak Gandhi; Mohan Trivedi

  • Affiliations:
  • Computer Vision and Robotics Research Laboratory, University of California at San Diego, La Jolla, USA (both authors)

  • Venue:
  • Machine Vision and Applications
  • Year:
  • 2005

Abstract

Omnidirectional cameras that give a 360° panoramic view of the surroundings have recently been used in many applications such as robotics, navigation, and surveillance. This paper describes the application of parametric ego-motion estimation for vehicle detection to perform surround analysis using an automobile-mounted camera. For this purpose, the parametric planar motion model is integrated with the transformations that compensate for distortion in omnidirectional images. The framework is used to detect objects with independent motion or height above the road. Camera calibration as well as the approximate vehicle speed obtained from a CAN bus are integrated with the motion information from spatial and temporal gradients using a Bayesian approach. The approach is tested for various configurations of an automobile-mounted omni camera as well as a rectilinear camera. Successful detection and tracking of moving vehicles and generation of a surround map are demonstrated for application to intelligent driver support.
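
The abstract's core idea, compensating planar (road-plane) ego-motion predicted from calibration and CAN-bus speed and then flagging pixels that violate the planar model, can be illustrated with a minimal sketch. This is not the authors' implementation: the helper names (predict_homography, detect_outliers), the calibration values, and the simple residual threshold are assumptions for illustration, and the distortion compensation for the omnidirectional lens is omitted.

```python
# Minimal sketch (not the paper's code): warp the previous frame by a road-plane
# homography predicted from calibration and CAN-bus speed, then flag pixels whose
# residual suggests independent motion or height above the road.
# predict_homography and detect_outliers are hypothetical helper names.
import numpy as np
import cv2

def predict_homography(K, R, t, n, d):
    """Homography induced by the road plane n·X = d for inter-frame motion (R, t)."""
    return K @ (R + np.outer(t, n) / d) @ np.linalg.inv(K)

def detect_outliers(prev, curr, H, thresh=20):
    """Warp prev by the predicted road homography and threshold the residual image."""
    h, w = curr.shape[:2]
    warped = cv2.warpPerspective(prev, H, (w, h))
    residual = cv2.absdiff(curr, warped)   # large where the planar model fails
    return residual > thresh               # mask of candidate moving / raised pixels

if __name__ == "__main__":
    # Synthetic example with assumed calibration and a small forward translation.
    K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
    R = np.eye(3)                          # negligible rotation between frames
    t = np.array([0.0, 0.0, 0.4])          # forward motion = CAN-bus speed * frame time
    n, d = np.array([0.0, 1.0, 0.0]), 1.4  # road-plane normal (Y down) and camera height (m)
    H = predict_homography(K, R, t, n, d)
    prev = np.random.randint(0, 255, (480, 640), np.uint8)
    curr = cv2.warpPerspective(prev, H, (640, 480))  # frame consistent with the road plane
    mask = detect_outliers(prev, curr, H)
    print("outlier pixels:", int(mask.sum()))
```

In the paper this residual test is not a fixed threshold; the motion information from spatial and temporal gradients is combined with the calibration and speed priors in a Bayesian formulation, and the planar model is composed with the omnidirectional projection rather than applied to a rectilinear image as above.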