Although 2D multi-touch interaction has become widespread on mobile devices, intuitive ways to interact with 3D objects have not been thoroughly explored. We present a study of natural and guided multi-touch interaction with 3D objects on a 2D multi-touch display. Specifically, we focus on interactions with 3D objects that have rotational, tightening, or switching components, such as the mechanisms found in mechanical operation or training simulations. Our study makes the following contributions: a classification procedure for determining the category and nature of a gesture, an initial user-defined gesture set for multi-touch manipulation of 3D objects, and a characterization of user preferences for metaphorical versus physical gestures.
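The abstract does not detail the classification procedure itself. Purely as an illustrative sketch (not the authors' method), one common low-level building block for multi-touch gesture recognition is distinguishing a two-touch rotation from a two-touch translation by the change in angle of the vector between the two touch points; the function name and threshold below are hypothetical:

```python
import math

def classify_two_touch(p1_start, p2_start, p1_end, p2_end,
                       angle_thresh_deg=15.0):
    """Illustrative only: label a two-touch gesture as 'rotate' if the
    vector between the two touches turns by more than angle_thresh_deg
    degrees over the gesture, otherwise 'translate'."""
    # Angle of the inter-touch vector at the start and end of the gesture.
    a0 = math.atan2(p2_start[1] - p1_start[1], p2_start[0] - p1_start[0])
    a1 = math.atan2(p2_end[1] - p1_end[1], p2_end[0] - p1_end[0])
    delta = math.degrees(a1 - a0)
    # Wrap the angular difference into [-180, 180] degrees.
    delta = (delta + 180.0) % 360.0 - 180.0
    return "rotate" if abs(delta) > angle_thresh_deg else "translate"
```

A full recognizer would combine several such cues (touch count, pinch distance, trajectory shape) before mapping the gesture onto a 3D manipulation.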