Indirect mappings of multi-touch input using one and two hands
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Recent advances in touch sensing technologies have made it possible to interact with computers in a device-free manner, allowing for arguably more natural and intuitive input using multiple hands and fingers. Unfortunately, existing multi-point touch-sensitive devices have a number of sensor limitations that restrict the types of manipulations that can be performed. Additionally, while many well-studied techniques from the bimanual interaction literature are applicable to these emerging multi-point devices, many questions remain about how multiple fingers from a single hand can best be utilized on touch-sensitive surfaces. This dissertation addresses some of these open issues. We first develop the Visual Touchpad, a low-cost vision-based input device that detects multiple hands and fingertips over a constrained planar surface. Unlike existing multi-point devices, the Visual Touchpad captures a reliable 2D image of the entire hand, from which more detailed finger information, such as labels, orientation, and hover state, can be derived. We then design and implement three systems that leverage the capabilities of the Visual Touchpad to explore how multiple fingers can be used in real-world interface scenarios. Next, we propose and experimentally validate a fluid interaction style that uses the thumb and index finger of a single hand in an asymmetric-dependent manner to control bi-digit widgets, where the index finger performs the primary and more frequent 2D tasks and the thumb performs secondary, less frequent tasks that support the index finger's manipulations. We then investigate the impact of visual feedback on the perception of finger span when using bi-digit widgets to merge command selection and direct manipulation.
Results suggest that users can select from up to four discrete commands with the thumb without any visual feedback, which allows us to design a set of more advanced bi-digit widgets that facilitate smooth transitions from novice to expert usage.
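The finding that users can reliably distinguish about four thumb positions without visual feedback implies a simple quantization scheme: map the measured thumb-index span onto a small set of command bins. A minimal sketch of such a mapping is shown below; the function name, calibration range, and bin count are illustrative assumptions, not details from the dissertation itself.

```python
def span_to_command(span_mm, min_span=20.0, max_span=80.0, n_commands=4):
    """Quantize a thumb-index finger span (in mm) into one of
    n_commands discrete command indices (0 .. n_commands - 1).

    min_span/max_span are hypothetical per-user calibration bounds.
    """
    # Normalize the span into [0, 1] over the calibrated range.
    t = (span_mm - min_span) / (max_span - min_span)
    # Clamp so out-of-range spans snap to the nearest end bin.
    t = max(0.0, min(t, 1.0))
    # Uniformly bin; the max span falls into the last bin.
    return min(int(t * n_commands), n_commands - 1)
```

With four bins, a fully closed pinch selects command 0 and a fully spread one selects command 3; wider bins (fewer commands) trade expressiveness for the eyes-free reliability reported in the study.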