RemoteFusion: real time depth camera fusion for remote collaboration on physical tasks

  • Authors:
  • Matt Adcock, Stuart Anderson, Bruce Thomas

  • Affiliations:
  • Matt Adcock: CSIRO, Canberra, Australia and University of South Australia, Mawson Lakes, Australia
  • Stuart Anderson: University of Canberra, Canberra, Australia and CSIRO, Canberra, Australia
  • Bruce Thomas: University of South Australia, Mawson Lakes, Australia

  • Venue:
  • Proceedings of the 12th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry
  • Year:
  • 2013

Abstract

Remote guidance systems allow humans to collaborate on physical tasks across large distances and have applications in fields such as medicine, maintenance, and working with hazardous substances. Existing systems typically provide two-dimensional video streams to remote participants, restricting remote viewpoints to the locations of the physical cameras. Recent systems have added the ability for a remote expert to annotate their 2D view and have those annotations displayed in the physical workspace for the local worker. We present a prototype remote guidance system, called RemoteFusion, based on the volumetric fusion of commodity depth cameras. The system incorporates real-time 3D fusion with color, the ability to distinguish and render dynamic elements of a scene, whether human or non-human, a multi-touch-driven free 3D viewpoint, and a Spatial Augmented Reality (SAR) light annotation mechanism. We give a physical overview of the system, including its hardware and software configuration, and detail the implementation of each of these key features.
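
Real-time depth camera fusion of the kind the abstract describes typically integrates each depth frame into a truncated signed distance function (TSDF) volume, in the spirit of KinectFusion-style pipelines. The sketch below shows a minimal, CPU-side version of a weighted TSDF update with per-voxel color; the types and parameters (Voxel, mu, max_weight) are illustrative assumptions, not the paper's actual GPU implementation.

```cpp
// Illustrative sketch of weighted TSDF fusion with per-voxel color.
// All names and parameters here are assumptions for illustration;
// the paper's actual (GPU-based) implementation is not reproduced.
#include <algorithm>
#include <array>
#include <cstdint>

struct Voxel {
    float tsdf   = 1.0f;                 // truncated signed distance, in [-1, 1]
    float weight = 0.0f;                 // accumulated integration weight
    std::array<std::uint8_t, 3> rgb{};   // running color estimate
};

// Fold one depth/color observation into a voxel. `sdf` is the signed
// distance from the voxel center to the observed surface along the camera
// ray, `mu` is the truncation band, and `max_weight` caps the running
// average so the volume can still adapt to dynamic scene elements.
inline void integrate(Voxel& v, float sdf,
                      const std::array<std::uint8_t, 3>& color,
                      float mu, float max_weight = 64.0f) {
    if (sdf < -mu) return;                         // voxel occluded: no update
    const float tsdf  = std::min(1.0f, sdf / mu);  // truncate to [-1, 1]
    const float w_new = v.weight + 1.0f;
    v.tsdf = (v.tsdf * v.weight + tsdf) / w_new;   // weighted running average
    for (int c = 0; c < 3; ++c)                    // blend color the same way
        v.rgb[c] = static_cast<std::uint8_t>(
            (v.rgb[c] * v.weight + color[c]) / w_new);
    v.weight = std::min(w_new, max_weight);
}
```

Capping the integration weight is one common way such systems stay responsive to change: it bounds how much history a voxel carries, so dynamic elements can be re-integrated rather than leaving ghost geometry behind.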