Reaching objects displayed on the far side of a large multi-touch tabletop is difficult and forces users to walk around the table. We present HandyPointing, a remote pointing technique based on pull-out, a bimanual multi-touch gesture that lets users both translate the cursor and dynamically adjust the control-display (C-D) ratio. We conducted one experiment to measure the quantitative performance of our technique, and another to study how users choose between it and direct touch input (i.e., tap and drag).
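The core idea of a dynamic C-D ratio can be sketched as follows. This is a minimal illustration only: the function names, the linear gain mapping, and the use of hand separation as the gain control are assumptions for exposition, not details taken from the paper.

```python
# Hypothetical sketch of remote pointing with a dynamic C-D ratio.
# Assumption: the distance between the two hands controls the gain,
# and the dominant hand's finger motion, scaled by that gain, moves the cursor.

def cd_gain(hand_separation, min_gain=1.0, max_gain=8.0, max_separation=0.4):
    """Map hand separation (metres) linearly to a C-D gain in [min_gain, max_gain]."""
    t = max(0.0, min(1.0, hand_separation / max_separation))
    return min_gain + t * (max_gain - min_gain)

def update_cursor(cursor, finger_delta, hand_separation):
    """Translate the cursor by the finger motion scaled by the current gain."""
    g = cd_gain(hand_separation)
    return (cursor[0] + g * finger_delta[0],
            cursor[1] + g * finger_delta[1])
```

With hands close together the gain stays near 1 for precise local control; spreading the hands raises the gain so small finger motions reach distant targets.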