Fusion of time-of-flight and stereo for disambiguation of depth measurements

  • Authors:
  • Ouk Choi; Seungkyu Lee

  • Affiliations:
  • Samsung Advanced Institute of Technology, Republic of Korea (both authors)

  • Venue:
  • ACCV'12 Proceedings of the 11th Asian conference on Computer Vision - Volume Part IV
  • Year:
  • 2012


Abstract

The complementary nature of time-of-flight and stereo sensing has motivated fusion systems that deliver high-quality depth maps robust to the depth bias and random noise of the time-of-flight camera as well as to the lack of scene texture. This paper shows that such a fusion system is also effective for resolving the ambiguity in time-of-flight depth measurements caused by phase wrapping, in which scene points beyond a certain maximum range are recorded with depth values far smaller than their actual ones. To recover the unwrapped depth map, we build a Markov random field based on the constraint that a correctly unwrapped depth value should minimize the dissimilarity between its projections onto the stereo images. The unwrapped depth map is then used to guide stereo matching, reducing matching ambiguity and enhancing depth quality in textureless regions. Experiments show that the proposed method extends the usable range of the time-of-flight camera, delivering unambiguous depth maps of real scenes.
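The core idea can be sketched as follows. A phase-based time-of-flight camera measures depth modulo an unambiguous range r_max = c / (2 f_mod), so the true depth is the wrapped measurement plus an unknown integer multiple of r_max. The sketch below, a simplified per-pixel stand-in for the paper's MRF formulation (which also enforces spatial smoothness), picks the wrapping number whose candidate depth minimizes a stereo dissimilarity function; all names are illustrative, not from the paper.

```python
# Hedged sketch of time-of-flight phase unwrapping guided by stereo,
# assuming a per-pixel dissimilarity function in place of the paper's
# full Markov-random-field energy.

C = 299_792_458.0  # speed of light in m/s


def unambiguous_range(mod_freq_hz: float) -> float:
    """Maximum unambiguous range of a ToF camera: r_max = c / (2 * f_mod)."""
    return C / (2.0 * mod_freq_hz)


def unwrap_depth(d_wrapped: float, r_max: float, dissimilarity, max_n: int = 3) -> float:
    """Choose among candidate depths d_wrapped + n * r_max (n = 0..max_n)
    the one whose projection onto the stereo pair is most consistent,
    i.e. minimizes the given dissimilarity function."""
    candidates = [d_wrapped + n * r_max for n in range(max_n + 1)]
    return min(candidates, key=dissimilarity)
```

For example, with a 20 MHz modulation frequency the unambiguous range is about 7.49 m, so a point 9.5 m away would be recorded near 2.0 m; a dissimilarity function peaked at the true depth recovers the unwrapped value 9.5 m.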