Biases and interaction effects in gestural acquisition of auditory targets using a hand-held device

  • Authors:
  • Lonce Wyse; Suranga Nanayakkara; Norikazu Mitani

  • Affiliations:
  • National University of Singapore; Singapore University of Technology and Design; National University of Singapore

  • Venue:
  • Proceedings of the 23rd Australian Computer-Human Interaction Conference
  • Year:
  • 2011

Abstract

A user study explored bias and interaction effects in an auditory target tracking task using a hand-held gestural interface device for musical sound. Participants manipulated the physical dimensions of pitch, roll, and yaw of a hand-held device, which were mapped to the sound dimensions of musical pitch, timbre, and event density. Participants were first presented with a sound, which they then had to imitate as closely as possible by positioning the hand-held controller. Accuracy and time-to-target were influenced by specific sounds as well as pairings between controllers and sounds. Some bias effects in gestural dimensions independent of sound mappings were also found.
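The abstract describes a mapping from the device's physical orientation (pitch, roll, yaw) to three sound dimensions (musical pitch, timbre, event density). A minimal sketch of such a mapping is given below; it is a hypothetical illustration, not the authors' implementation, and the ranges, parameter names, and the function `orientation_to_sound` are all assumptions:

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] to [out_lo, out_hi], clamped."""
    t = (value - in_lo) / (in_hi - in_lo)
    t = max(0.0, min(1.0, t))
    return out_lo + t * (out_hi - out_lo)

def orientation_to_sound(pitch_deg, roll_deg, yaw_deg):
    """Map device orientation to (musical pitch, timbre, event density).

    All ranges are illustrative assumptions, not from the paper:
      device pitch  [-90, 90]   -> MIDI note 48..84
      device roll   [-90, 90]   -> timbre brightness 0..1
      device yaw    [-180, 180] -> sound events per second 1..16
    """
    midi_note = scale(pitch_deg, -90, 90, 48, 84)
    brightness = scale(roll_deg, -90, 90, 0.0, 1.0)
    density = scale(yaw_deg, -180, 180, 1.0, 16.0)
    return midi_note, brightness, density
```

A target-matching trial of the kind the study describes would then amount to the participant steering the device until `orientation_to_sound` of its current orientation matches the parameters of the presented target sound.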