Remote guidance systems allow people to collaborate on physical tasks across large distances and have applications in fields such as medicine, maintenance, and work with hazardous substances. Existing systems typically provide two-dimensional video streams to remote participants, restricting viewpoints to the locations of the physical cameras. Recent systems have added the ability for a remote expert to annotate their 2D view and for these annotations to be displayed in the physical workspace to the local worker. We present a prototype remote guidance system, called RemoteFusion, which is based on the volumetric fusion of commodity depth cameras. The system incorporates real-time 3D fusion with color, the ability to distinguish and render dynamic scene elements whether human or non-human, a multi-touch-driven free 3D viewpoint, and a Spatial Augmented Reality (SAR) light annotation mechanism. We give a physical overview of the system, including its hardware and software configuration, and detail the implementation of each key feature.
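One capability named above is distinguishing dynamic scene elements from the fused static model. The abstract does not specify how this is done; a common approach with depth cameras is per-pixel depth differencing against a background model. The following is a minimal hypothetical sketch of that idea (the function name, threshold, and data layout are assumptions, not the authors' implementation):

```python
import numpy as np

def dynamic_mask(live_depth, background_depth, threshold_m=0.05):
    """Mark pixels whose depth deviates from the static background model.

    live_depth, background_depth: float arrays in metres; 0 means no reading.
    Returns a boolean mask of dynamic (moving or newly introduced) pixels.
    """
    # Ignore pixels where either the live frame or the model has no data.
    valid = (live_depth > 0) & (background_depth > 0)
    # A pixel is dynamic if its depth differs from the background by more
    # than the threshold (which should exceed the sensor's depth noise).
    moved = np.abs(live_depth - background_depth) > threshold_m
    return valid & moved

# Toy example: a static 4x4 plane at 1.0 m with a closer object at 0.6 m.
bg = np.full((4, 4), 1.0)
live = bg.copy()
live[1:3, 1:3] = 0.6  # dynamic object in front of the background
mask = dynamic_mask(live, bg)  # True only over the 2x2 dynamic region
```

In a fused volumetric system, the same comparison would be made against depth rendered from the static reconstruction rather than a single stored frame, so the background model remains valid as the virtual viewpoint moves.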