Large touch screens have recently been appearing in the automotive market, yet their usability while driving remains controversial. Flat screens provide no haptic guidance and therefore require visual attention to locate the interactive elements displayed on them. New interaction concepts are thus needed to minimize the visual attention required, keeping the driver's focus on the road and ensuring safety. In this paper, we explore three such approaches. The first is designed to make use of proprioception. The second incorporates physical handles to ease orientation on a large flat surface. The third applies directional touch gestures. We describe the results of a comparative study that investigates the required visual attention, task performance, and perceived usability relative to a state-of-the-art multifunctional controller. We found that direct touch buttons yield the best task completion times, but at a size of about 6×8 cm they were still not large enough for blind interaction. Physical elements in and around the screen space were regarded as useful for easing orientation. With touch gestures, participants were able to reduce their visual attention below the level required with the remote controller. Based on these findings, we argue that there are ways to make large screens more appropriate for in-car use and thus harness the advantages they provide in other respects.