Depth from stationary blur with adaptive filtering

  • Authors:
  • Jiang Yu Zheng; Min Shi

  • Affiliations:
  • Department of Computer Science, Indiana University Purdue University Indianapolis; Department of Computer Science, Indiana University Purdue University Indianapolis

  • Venue:
  • ACCV'07: Proceedings of the 8th Asian Conference on Computer Vision, Part II
  • Year:
  • 2007

Abstract

This work achieves efficient acquisition of scenes and their depths along long streets. A camera mounted on a vehicle moving along a path continuously scans the 1D scene through a sampling line set in the camera frame, forming a 2D route panorama. This paper extends a method that estimates depth from the camera path by analyzing the stationary blur in the route panorama. The temporal stationary blur is a perspective effect within the parallel projection, produced by the physical width of the sampling slit, and its degree is related to the scene depth from the camera path. The paper analyzes the behavior of the stationary blur with respect to the camera parameters and applies adaptive filtering to improve the depth estimation. The approach avoids feature matching or tracking in complex street scenes and facilitates real-time sensing. It also stores far less data than a structure-from-motion approach, so the sensing area can be extended significantly.
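
The depth-from-blur relation admits a simple illustration. The Python sketch below is a minimal interpretation under stated assumptions, not the paper's method: it assumes a linear model in which a point at depth Z, seen through a slit of pixel width w while the vehicle moves at speed V (focal length f, column sampling rate r), is smeared over roughly b = r·w·Z/(f·V) panorama columns, so Z = b·f·V/(r·w). It also substitutes a crude gradient-energy ratio for the paper's adaptive filtering as the blur measure; all function and parameter names here are hypothetical.

```python
import numpy as np
from scipy import ndimage

def depth_from_stationary_blur(panorama, speed, focal_len, slit_width, sample_rate):
    """Rough relative depth map from horizontal (stationary) blur in a route panorama.

    Simplified model (an assumption, not the paper's exact formulation):
    a point at depth Z stays inside a slit of pixel width w for w*Z/(f*V)
    seconds, so it is smeared over b = r*w*Z/(f*V) panorama columns,
    giving Z = b*f*V/(r*w).
    """
    gray = panorama.astype(np.float64)

    # Horizontal blur suppresses horizontal gradients relative to vertical
    # ones, so the local gradient-energy ratio serves as a crude, relative
    # measure of horizontal blur extent (up to a scale calibration).
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    ex = ndimage.uniform_filter(gx ** 2, size=15)  # local horizontal energy
    ey = ndimage.uniform_filter(gy ** 2, size=15)  # local vertical energy
    eps = 1e-6
    blur_extent = np.sqrt((ey + eps) / (ex + eps))  # columns, up to scale

    # Map blur extent to depth with the linear model above.
    return blur_extent * focal_len * speed / (sample_rate * slit_width)
```

In this simplified picture, nearby structure sweeps through the slit quickly and stays sharp (small depth), while distant facades linger in the slit and come out horizontally blurred (large depth); the paper's adaptive filtering refines the blur estimate that this crude ratio only approximates.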