Interaction with communication and infotainment systems in the car is common while driving. Our research investigates modalities and techniques that enable interaction with such applications while driving without compromising safety. In this paper we present the results of an experiment in which eye-gaze tracking, combined with a button on the steering wheel, serves as explicit input in place of interaction on a touch screen. This approach retains the advantages of direct interaction on visual displays while avoiding the drawbacks of touch screens: the screen can be placed freely (even out of the user's reach), and both hands remain on the steering wheel. The results show that this interaction modality is slightly slower and more distracting than a touch screen, but significantly faster than automated speech interaction.