Many real-world scenarios involve a remote helper guiding a local worker through manipulations of physical objects (physical tasks). Technologies and systems have been developed to support such collaborations, but existing systems often confine collaborators to fixed desktop settings. Yet there are many situations in which collaborators are mobile and/or a desktop setup is not feasible. In this paper, we present HandsInAir, a real-time wearable system for remote collaboration. HandsInAir is designed to support the mobility of both the worker and the helper and to provide easy access to remote expertise. In particular, the system implements a novel approach that allows the helper to perform hand gestures in the air while leaving both of the worker's hands free for object manipulation. We describe the system, report an evaluation of it, and outline directions for future work.
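The core idea of overlaying a helper's mid-air hand gestures onto the worker's view can be sketched in a few lines. The following is a minimal illustration, not the paper's implementation: it assumes the helper's hand is segmented from its camera frame by a crude skin-colour threshold (the threshold values are made up for the example) and then alpha-blended onto the worker's video frame.

```python
import numpy as np

def segment_hand(helper_frame, lo=(90, 40, 20), hi=(255, 180, 150)):
    """Crude skin-colour mask over an RGB uint8 frame of shape (H, W, 3).
    The RGB bounds are illustrative placeholders, not values from the paper."""
    lo, hi = np.array(lo), np.array(hi)
    return np.all((helper_frame >= lo) & (helper_frame <= hi), axis=-1)

def composite(worker_frame, helper_frame, mask, alpha=0.7):
    """Alpha-blend the masked helper pixels onto the worker's view."""
    out = worker_frame.astype(float)
    out[mask] = (1 - alpha) * out[mask] + alpha * helper_frame[mask]
    return out.astype(np.uint8)

# Toy frames: the worker's view is mid-grey; the helper frame contains
# a skin-toned block standing in for a gesturing hand.
worker = np.full((120, 160, 3), 128, np.uint8)
helper = np.zeros((120, 160, 3), np.uint8)
helper[40:80, 60:100] = (200, 120, 90)

mask = segment_hand(helper)
blended = composite(worker, helper, mask)
```

A real system would run this per frame over a network link, with the helper's camera pointed at their unadorned hands; the point here is only that the helper's gestures are rendered over the worker's scene rather than requiring a shared desk or surface.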