Touch screens are on the rise and are rapidly replacing traditional knobs and buttons. However, their lack of tangible guidance and feedback can become a problem in scenarios where visual attention is scarce. Besides dynamic tactile feedback through vibration, the usability of touch screens can be improved by static haptic structures such as shaped or textured surfaces. In this paper we describe the prototype of an in-vehicle application that uses unimanual four-finger interaction and haptic guidance to avoid visual distraction from the primary task of driving. We built a low-fidelity prototype with static haptics using an Android tablet and silicone foil. A user study showed that flexible positioning of touch buttons mapped to the user's fingers was more convenient and produced fewer errors than fixed positioning. A curved haptic border gave the user orientation and enabled a new selection mode: dragging buttons over the edge reduced interaction time compared to double tapping. We present several variants of unimanual multi-finger interaction on planar and non-planar surfaces. Our results can inform the development of future concepts for blind interaction.
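The two interaction techniques the study compares can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, coordinate convention (x grows rightward), and the left-to-right finger ordering are assumptions made for illustration. Flexible positioning assigns one button to wherever each of the four fingers lands, and the drag-over-edge selection fires when a button is pulled across the haptic border:

```python
def assign_buttons(touch_points, labels=("A", "B", "C", "D")):
    """Flexible positioning: map one button label to each detected finger.

    Instead of fixed on-screen locations, buttons appear under the fingers.
    Fingers are ordered left to right by their x coordinate (an assumption;
    a real system would track pointer IDs from the touch driver).
    """
    ordered = sorted(touch_points, key=lambda p: p[0])
    return {label: pos for label, pos in zip(labels, ordered)}

def dragged_over_edge(start, end, edge_x):
    """Selection mode: a button dragged across the haptic border at edge_x
    counts as selected, replacing the slower double-tap confirmation."""
    return start[0] < edge_x <= end[0]
```

A usage example: four touches at arbitrary positions yield four labeled buttons in left-to-right order, and a drag that crosses `edge_x` registers as a selection while a shorter drag does not.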