Knowing where and when to look in a time-critical multimodal dual task

  • Authors:
  • Anthony J. Hornof, Yunfeng Zhang, Tim Halverson

  • Affiliations:
  • University of Oregon, Eugene, OR, USA (all authors)

  • Venue:
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year:
  • 2010

Abstract

Human-computer systems intended for time-critical multitasking need to be designed with an understanding of how humans can coordinate and interleave perceptual, memory, and motor processes. This paper presents human performance data for a highly practiced, time-critical dual task. In the first of the two interleaved tasks, participants tracked a target with a joystick. In the second, participants keyed in responses to objects moving across a radar display. Task manipulations included the peripheral visibility of the secondary display (visible or not) and the presence or absence of auditory cues to assist with the radar task. Eye movement analyses reveal extensive coordination and overlapping of human information processes, and the extent to which the task manipulations helped or hindered dual-task performance. For example, auditory cues helped only slightly when the secondary display was peripherally visible, but substantially when it was not.