Towards user-defined multi-touch gestures for 3D objects
Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces
Multi-touch interfaces have emerged with the widespread use of smartphones. Although many people interact with 2D applications through touchscreens, interaction with 3D applications remains little explored. Most 3D object manipulation techniques have been created by designers, and users are generally excluded from the design process. We conducted a user study to better understand how non-technical users interact with a 3D object through touchscreen input. In the experiment, users manipulated a 3D cube, presented from three points of view, through rotations, scaling, and translations (RST). Sixteen users participated, and 432 gestures were analyzed. To classify the data, we introduce a taxonomy of 3D manipulation gestures for touchscreens. We then identify a set of strategies employed by users to perform the proposed cube transformations. Our findings suggest that each participant uses several strategies, with one predominant. Finally, we propose guidelines to help designers create more user-friendly tools.