We present a novel technique implementing barehanded interaction with virtual 3D content by employing a time-of-flight camera. The system improves on existing 3D multi-touch systems by working regardless of lighting conditions and providing a working volume large enough for multiple users. Previous systems were limited by environmental requirements, working volume, or the computational resources necessary for real-time operation. By employing a time-of-flight camera, the system reliably recognizes gestures at the finger level in real time at more than 50 fps on commodity computer hardware, using our newly developed precision hand- and finger-tracking algorithm. Building on this algorithm, the system performs gesture recognition with simple constraint modeling over statistical aggregations of hand appearances in a working volume of more than 8 cubic meters. Two iterations of user tests were performed on a prototype system, demonstrating the feasibility and usability of the approach and providing initial insights into user acceptance of true barehanded, touch-based 3D interaction.
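The abstract describes gesture recognition as simple constraint modeling over statistical aggregations of hand appearances. The paper does not give the actual features or constraints, so the following is only a minimal illustrative sketch under assumed features (fingertip count and palm depth per frame) and assumed thresholds, not the authors' method:

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class HandFrame:
    """One per-frame hand observation (hypothetical feature set)."""
    fingertip_count: int   # fingertips detected in this frame
    palm_depth_m: float    # palm distance from the camera, meters

def aggregate(frames):
    # Statistical aggregation over a short window of hand appearances,
    # smoothing out per-frame detection noise.
    return {
        "mean_fingers": mean(f.fingertip_count for f in frames),
        "mean_depth": mean(f.palm_depth_m for f in frames),
    }

def classify(agg, max_range_m=4.0):
    # Simple constraint modeling: threshold checks on the aggregates.
    # All thresholds here are illustrative assumptions.
    if agg["mean_depth"] > max_range_m:
        return "out_of_volume"
    if agg["mean_fingers"] >= 4.5:
        return "open_hand"
    if agg["mean_fingers"] <= 0.5:
        return "fist"
    return "pointing"
```

Aggregating before classifying is what makes threshold constraints viable at 50 fps: a single misdetected frame cannot flip the recognized gesture.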