StereoBrush: interactive 2D to 3D conversion using discontinuous warps

  • Authors:
  • O. Wang; M. Lang; M. Frei; A. Hornung; A. Smolic; M. Gross

  • Affiliations:
  • Disney Research Zürich; Disney Research Zürich and ETH Zürich; ETH Zürich; Disney Research Zürich; Disney Research Zürich; Disney Research Zürich and ETH Zürich

  • Venue:
  • Proceedings of the Eighth Eurographics Symposium on Sketch-Based Interfaces and Modeling
  • Year:
  • 2011

Abstract

We introduce a novel workflow for stereoscopic 2D to 3D conversion in which the user "paints" depth onto a 2D image via sparse scribbles, instantaneously receiving intuitive 3D feedback. This workflow is enabled by the introduction of a discontinuous warping technique that creates stereoscopic pairs from sparse, possibly erroneous user input. Our method assumes a piecewise continuous depth representation, preserving visual continuity in most areas while creating sharp depth discontinuities at important object boundaries. As opposed to prior work that relies strictly on a per-pixel depth map, our scribbles are processed as soft constraints in a global solve and operate entirely on image-domain disparity, allowing for relaxed input requirements. This formulation also allows us to simultaneously compute a disparity-and-content-aware stretching of background areas to automatically fill disoccluded regions with valid stereo information. We tightly integrate all steps of stereo content conversion into a single optimization framework, which can then be solved on a GPU at interactive rates. The instant feedback received while painting depth allows even untrained users to quickly create compelling 3D scenes from single-view footage.
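
To make the core idea concrete, the sketch below illustrates (in a simplified, CPU-bound form) how sparse disparity scribbles can act as soft constraints in a global least-squares solve that yields a dense, piecewise-smooth disparity field, with smoothness weakened across strong image edges so that discontinuities can form at object boundaries. This is an illustrative assumption-laden reconstruction, not the authors' implementation: the function names, weights, and the naive horizontal forward warp at the end are hypothetical, and the paper's actual method is a GPU-based warp that fills disocclusions by content-aware stretching rather than leaving holes.

```python
# Minimal sketch: propagate sparse disparity scribbles to a dense map via a
# global least-squares solve with soft data terms and edge-aware smoothness.
# Illustrative only; not the paper's formulation or code.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def propagate_disparity(image_gray, scribble_disp, scribble_mask,
                        data_weight=10.0, edge_sigma=0.05):
    """Dense disparity from sparse scribbles (least squares).

    image_gray    : (H, W) float array in [0, 1]
    scribble_disp : (H, W) disparity values, used only where scribbled
    scribble_mask : (H, W) bool, True where the user painted a scribble
    """
    H, W = image_gray.shape
    n = H * W
    idx = np.arange(n).reshape(H, W)

    rows, cols, vals, rhs = [], [], [], []

    # Smoothness equations: w * (d_p - d_q) = 0, with w attenuated across
    # strong intensity edges so depth discontinuities can appear there.
    for pa, pb, ia, ib in [
        (idx[:, :-1].ravel(), idx[:, 1:].ravel(),
         image_gray[:, :-1].ravel(), image_gray[:, 1:].ravel()),
        (idx[:-1, :].ravel(), idx[1:, :].ravel(),
         image_gray[:-1, :].ravel(), image_gray[1:, :].ravel()),
    ]:
        w = np.exp(-np.abs(ia - ib) / edge_sigma)
        for p, q, wi in zip(pa, pb, w):
            r = len(rhs)
            rows += [r, r]; cols += [p, q]; vals += [wi, -wi]
            rhs.append(0.0)

    # Soft data terms: pull scribbled pixels toward the painted disparity.
    for p, d in zip(idx[scribble_mask], scribble_disp[scribble_mask]):
        r = len(rhs)
        rows.append(r); cols.append(p); vals.append(data_weight)
        rhs.append(data_weight * d)

    A = sp.csr_matrix((vals, (rows, cols)), shape=(len(rhs), n))
    d = spla.lsqr(A, np.asarray(rhs))[0]
    return d.reshape(H, W)

def naive_stereo_pair(image, disparity):
    """Shift each pixel horizontally by half its disparity toward each view.
    A crude forward warp for illustration; it leaves holes at disocclusions,
    which the paper instead fills by disparity-and-content-aware stretching."""
    H, W = image.shape[:2]
    left, right = np.zeros_like(image), np.zeros_like(image)
    xs = np.arange(W)
    for y in range(H):
        xl = np.clip(np.round(xs + disparity[y] / 2).astype(int), 0, W - 1)
        xr = np.clip(np.round(xs - disparity[y] / 2).astype(int), 0, W - 1)
        left[y, xl] = image[y]
        right[y, xr] = image[y]
    return left, right
```

Solving a single global system in this way is what makes soft, possibly erroneous scribbles tolerable: a stray stroke only biases the solution locally instead of hard-constraining it, which is consistent with the relaxed input requirements described in the abstract.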