This paper presents the development and evaluation of two-handed mid-air gestures for operating a computer accurately and with little effort. The central design idea is that one hand is used for pointing while the other hand issues four standard commands: selection, drag-and-drop, rotation, and zoom. Two gesture vocabularies are compared in a user evaluation. The paper further presents a novel evaluation methodology and the application developed to evaluate the four commands, first separately and then in combination. The user evaluation found significant differences for the rotation and zoom gestures: the iconic gesture vocabulary performed better and was rated more highly by users than the technological gesture vocabulary.
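The division of labor described above, one hand pointing and the other selecting among four commands, can be sketched as a simple dispatcher. This is an illustrative sketch, not the paper's implementation; the gesture labels ("pinch", "grab", "twist", "spread") and all class and function names are hypothetical stand-ins for the output of a hand-tracking gesture recognizer.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

Point = Tuple[float, float]

@dataclass
class Frame:
    pointer: Point  # position reported for the pointing hand
    gesture: str    # label recognized for the command hand

class TwoHandedController:
    def __init__(self) -> None:
        # Hypothetical gesture labels; the paper's two vocabularies
        # (iconic and technological) map commands differently.
        self.commands: Dict[str, Callable[[Point], str]] = {
            "pinch": lambda p: f"select at {p}",
            "grab": lambda p: f"drag to {p}",
            "twist": lambda p: f"rotate around {p}",
            "spread": lambda p: f"zoom centered on {p}",
        }

    def handle(self, frame: Frame) -> str:
        # The pointing hand sets the target; the command hand picks the action.
        action = self.commands.get(frame.gesture)
        return action(frame.pointer) if action else f"point at {frame.pointer}"

controller = TwoHandedController()
print(controller.handle(Frame(pointer=(0.4, 0.6), gesture="pinch")))
```

The key design property this sketch mirrors is that pointing and commanding never compete for the same hand, so a command can be triggered without disturbing the cursor position.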