A multimodal dialogue interface for mobile local search

  • Authors:
  • Patrick Ehlen; Michael Johnston

  • Affiliations:
  • AT&T, San Francisco, California, USA; AT&T Labs Research, Florham Park, New Jersey, USA

  • Venue:
  • Proceedings of the Companion Publication of the 2013 International Conference on Intelligent User Interfaces (IUI '13 Companion)
  • Year:
  • 2013

Abstract

Speak4it℠ uses a multimodal interface to perform mobile search for local businesses. Users combine simultaneous speech and touch to input queries or commands, for example by saying "gas stations" while tracing a route on a touchscreen. This demonstration will exhibit an extension of our multimodal semantic processing architecture from a one-shot query system to a multimodal dialogue system that tracks dialogue state over multiple turns and resolves prior context using unification-based context resolution. We illustrate the capabilities and limitations of this approach to multimodal interpretation, describing the challenges of supporting true multimodal interaction in a deployed mobile service, while offering an interactive demonstration on tablets and smartphones.
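
The unification-based context resolution mentioned in the abstract can be pictured with a small sketch: a partial semantic frame from a follow-up turn is unified with the stored dialogue state, inheriting slots (such as the traced route) that the new turn leaves unspecified, and failing when the frames conflict. The frame structure, slot names, and `unify` helper below are illustrative assumptions, not the deployed Speak4it implementation.

```python
def unify(current, context):
    """Unify a partial semantic frame from the current turn with the
    prior dialogue context. Returns the merged frame, or None when the
    two frames assign conflicting values to the same slot.
    (Hypothetical sketch; not the authors' code.)"""
    result = dict(context)  # start from the prior dialogue state
    for key, value in current.items():
        if key not in result:
            result[key] = value  # new slot contributed by this turn
        elif isinstance(value, dict) and isinstance(result[key], dict):
            sub = unify(value, result[key])  # recurse into substructure
            if sub is None:
                return None  # conflicting substructure: unification fails
            result[key] = sub
        elif result[key] != value:
            return None  # atomic slot conflict: unification fails
    return result


# Turn 1: "gas stations" while tracing a route on the touchscreen.
context = {
    "intent": "search",
    "category": "gas station",
    "location": {"type": "route",
                 "points": [(37.775, -122.419), (37.790, -122.401)]},
}

# Turn 2: "show the cheapest ones" -- a partial frame that omits the
# category and location; unification inherits both from the context.
turn2 = {"intent": "search", "sort": "price_ascending"}

print(unify(turn2, context))
# -> the turn-2 frame enriched with the category and traced route
```

Because unification fails on conflicting values, a turn that explicitly changes a slot (say, a new category) would first retract the stale slot from the context before merging, which is one common way such systems handle overrides.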