Motion does matter: an examination of speech-based text entry on the move

  • Authors:
  • J. Price; Min Lin; Jinjuan Feng; Rich Goldman; Andrew Sears; A. Jacko

  • Affiliations:
  • UMBC, Information Systems Department, Interactive Systems Research Center, 1000 Hilltop Circle, Baltimore, MD 21250, USA (J. Price, M. Lin, J. Feng, R. Goldman, A. Sears); Georgia Institute of Technology, School of Industrial & Systems Engineering, 765 Ferst Drive, Atlanta, GA 30332-0205, USA (A. Jacko)

  • Venue:
  • Universal Access in the Information Society
  • Year:
  • 2006


Abstract

Desktop interaction solutions are often inappropriate for mobile devices because of small screen sizes and the need for portability. Speech recognition can improve interactions by providing a relatively hands-free solution that can be used in a variety of situations. Although mobile systems are designed to be used on the move, few studies have examined the effects of motion on mobile interactions. This paper investigates the effect of motion on automatic speech recognition (ASR) input for mobile devices. Recognition error rates (RER) were examined with participants performing text entry tasks while walking or while seated, as was the effect of ASR enrollment conditions on RER. The results suggest changes to how users should enroll (train) ASR systems intended for both mobile and seated use.
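
The abstract does not spell out how RER is computed. A minimal sketch, assuming a conventional word-level error measure for ASR output (substitutions, insertions, and deletions counted via edit distance over the reference word count), might look like the following; the function name and formula are illustrative assumptions, not the authors' published definition.

```python
# Hypothetical recognition error rate (RER) sketch, assuming a
# word-error-rate style metric: edit distance between the reference
# text and the recognized text, normalized by the reference length.

def recognition_error_rate(reference: str, recognized: str) -> float:
    ref = reference.split()
    hyp = recognized.split()
    # Dynamic-programming edit distance over word tokens.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution/match
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

if __name__ == "__main__":
    # Two of five reference words are misrecognized -> RER = 0.4
    print(recognition_error_rate("the quick brown fox jumps",
                                 "the quick crown fox lumps"))
```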