In tabletop computing, it is crucial to instantiate objects, such as documents or virtual containers, in a way that is ergonomically convenient for users. In particular, objects need to be positioned within reach of users, oriented properly, and scaled appropriately for convenient touch interaction. Because the user's location at the device is usually unknown to the system, tabletop user interfaces typically spawn objects at a default position with a default orientation and size. Users therefore have to manipulate objects after instantiation until they are properly aligned and scaled, which can be a cumbersome and time-consuming process. We designed two gesture-based interaction techniques that instantiate objects with a convenient orientation, size, and position, making further adjustments to these properties unnecessary. We describe the functionality of both techniques and discuss insights gathered during initial evaluations.
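The core idea of deriving all three placement properties from a single gesture can be sketched as follows. This is a hypothetical illustration only: it assumes a two-finger framing gesture whose touch points define opposite corners of the new object, which is not necessarily either of the techniques described in the paper.

```python
import math
from dataclasses import dataclass

@dataclass
class Placement:
    x: float      # centre of the new object (table coordinates)
    y: float
    angle: float  # orientation in radians
    size: float   # diagonal length of the object

def placement_from_two_touches(x1: float, y1: float,
                               x2: float, y2: float) -> Placement:
    """Derive position, orientation, and size of a newly
    instantiated object from two simultaneous touch points.

    Hypothetical sketch: the touch points are treated as opposite
    corners of the object's bounding box, so no post-instantiation
    translation, rotation, or scaling is needed.
    """
    cx, cy = (x1 + x2) / 2.0, (y1 + y2) / 2.0   # position: midpoint
    angle = math.atan2(y2 - y1, x2 - x1)         # orientation: finger axis
    size = math.hypot(x2 - x1, y2 - y1)          # size: finger distance
    return Placement(cx, cy, angle, size)
```

With such a mapping, a single gesture fixes position, orientation, and size at once, which is what makes subsequent manual adjustment unnecessary.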