Don't look at me, I'm talking to you: investigating input and output modalities for in-vehicle systems

  • Authors:
  • Lars Holm Christiansen;Nikolaj Yde Frederiksen;Brit Susan Jensen;Alex Ranch;Mikael B. Skov;Nissanthen Thiruravichandran

  • Affiliations:
  • HCI Lab, Department of Computer Science, Aalborg University, Aalborg East, Denmark (all authors)

  • Venue:
  • INTERACT'11 Proceedings of the 13th IFIP TC 13 international conference on Human-computer interaction - Volume Part II
  • Year:
  • 2011

Abstract

With a growing number of in-vehicle systems integrated into contemporary cars, the risk of driver distraction and lack of attention to the primary task of driving is increasing. One major research area concerns eyes-off-the-road and mind-off-the-road, which manifest differently for input and output techniques. In this paper, we investigate in-vehicle system input and output techniques to compare their effects on driving behavior and attention. We compare four techniques in a driving simulator: touch and gesture (input), and visual and audio (output). Our results showed that the separation of input and output is non-trivial. Gesture input resulted in significantly fewer eye glances than touch input, but also in poorer primary driving task performance. Further, audio output resulted in significantly fewer eye glances, but on the other hand also in longer task completion times and inferior primary driving task performance compared to visual output.