Eyes-free interaction with free-hand gestures and auditory menus

  • Authors:
  • Raine Kajastila; Tapio Lokki

  • Affiliations:
  • Aalto University School of Science, Department of Media Technology, P.O. Box 15400, FI-00076 Aalto, Finland (both authors)

  • Venue:
  • International Journal of Human-Computer Studies
  • Year:
  • 2013

Abstract

Auditory interfaces can outperform visual interfaces when a primary task, such as driving, competes for the attention of a user controlling a device, such as a radio. In emerging interfaces enabled by camera tracking, auditory displays may also provide viable alternatives to visual displays. This paper presents a user study of interoperable auditory and visual menus, in which the control gestures remain the same in both the visual and the auditory domain. The tested control methods included a novel free-hand gesture interaction with camera-based tracking, and touch screen interaction with a tablet. The task of the participants was to select numbers from a visual or an auditory menu presented in a circular layout and a numeric keypad layout. Results show that, even with the participants' full attention on the task, the performance and accuracy of the auditory interface are the same as or even slightly better than those of the visual interface when controlled with free-hand gestures. The auditory menu was measured to be slower in touch screen interaction, but the questionnaire revealed that over half of the participants felt the circular auditory menu was faster than the visual menu. Furthermore, touch screen interaction with visual and auditory feedback in the numeric layout was measured to be fastest, touch screen interaction with the circular menu second fastest, and the free-hand gesture interface slowest. The results suggest that auditory menus can provide a fast and desirable interface for controlling devices with free-hand gestures.