Many emerging technologies make it possible to extend interaction into the three-dimensional space directly above or in front of a multitouch surface, allowing people to control these devices by performing hand gestures in the air. In this paper, we present a method for extending interaction into the space above a multitouch surface using only a standard diffused surface illumination (DSI) device, without any additional sensors. We then focus on interaction techniques for activating graphical widgets located in this above-surface space. We conducted a study to elicit gestures for above-table widget activation, followed by a second study evaluating and comparing these gestures based on their performance. Our results showed no clear agreement on which gestures should be used to select objects in mid-air, and that participants performed better with gestures that were suggested less frequently but predicted by the designers to perform well than with those suggested most often by participants.
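The core idea of sensing hover with an unmodified DSI device is that light reflected back from a hand grows dimmer the farther the hand is from the surface. A minimal, hypothetical sketch of this kind of intensity-based classification is shown below; it assumes the raw IR camera frame is available as a 2D NumPy array, and all names and threshold values are illustrative, not the authors' implementation:

```python
import numpy as np

# Illustrative thresholds (assumed, not from the paper): reflected IR
# intensity falls off as the hand moves away from a DSI surface.
TOUCH_LEVEL = 200   # bright blob: finger on the surface
HOVER_LEVEL = 60    # dimmer blob: hand in the hoverspace above

def classify_blob(frame: np.ndarray):
    """Classify the reflected-light blob in an IR frame as 'touch',
    'hover', or 'none', and return the blob's centroid (x, y)."""
    mask = frame >= HOVER_LEVEL
    if not mask.any():
        return "none", None
    ys, xs = np.nonzero(mask)
    centroid = (float(xs.mean()), float(ys.mean()))
    # Peak brightness distinguishes contact from above-surface hover.
    peak = int(frame[mask].max())
    state = "touch" if peak >= TOUCH_LEVEL else "hover"
    return state, centroid

# Synthetic frame: a dim blob, as if a hand hovered above the surface.
frame = np.zeros((240, 320), dtype=np.uint8)
frame[100:120, 150:170] = 90
print(classify_blob(frame))  # ('hover', (159.5, 109.5))
```

A real pipeline would also need background subtraction and per-pixel calibration, since ambient IR and projector bleed vary across the surface; the sketch only shows the intensity-to-height reasoning.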