Multimodal local search in Speak4it

  • Authors:
  • Patrick Ehlen; Michael Johnston

  • Affiliations:
  • AT&T, San Francisco, CA, USA; AT&T Labs, Florham Park, NJ, USA

  • Venue:
  • Proceedings of the 16th international conference on Intelligent user interfaces
  • Year:
  • 2011

Abstract

Speak4it is a consumer-oriented mobile search application that leverages multimodal input and output to allow users to search for and act on local business information. It supports true multimodal integration, where user inputs can be distributed over multiple input modes. In addition to specifying queries by voice (e.g., "bike repair shops near the golden gate bridge"), users can combine speech and gesture: for example, saying "gas stations" while tracing a route on the display returns the gas stations along that route. We describe the underlying multimodal architecture and some challenges of supporting multimodal interaction as a deployed mobile service.
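
The abstract does not give implementation details, but the idea of distributing one query over two input modes can be sketched as a simple fusion step. The sketch below is purely illustrative: the type names (`SpeechInput`, `GestureInput`, `MultimodalQuery`) and the `integrate` function are hypothetical, not part of the Speak4it system.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

LatLon = Tuple[float, float]

@dataclass
class SpeechInput:
    # Recognized speech, e.g. "gas stations"
    transcript: str

@dataclass
class GestureInput:
    # Points traced on the map display (a route or area)
    trace: List[LatLon]

@dataclass
class MultimodalQuery:
    category: str          # what to search for, from speech
    region: List[LatLon]   # where to search, from gesture or a default

def integrate(speech: SpeechInput,
              gesture: Optional[GestureInput] = None,
              default_region: Optional[List[LatLon]] = None) -> MultimodalQuery:
    """Fuse speech and gesture into one local-search query.

    When a gesture accompanies the utterance, its trace supplies the
    search region; otherwise the query falls back to a default region
    (e.g., the current map view).
    """
    region = gesture.trace if gesture is not None else (default_region or [])
    return MultimodalQuery(category=speech.transcript, region=region)

# "gas stations" + a route traced on the display
query = integrate(SpeechInput("gas stations"),
                  GestureInput([(37.80, -122.47), (37.81, -122.45)]))
```

In this toy fusion, speech contributes the *what* and gesture the *where*; a deployed system would of course handle recognition uncertainty and timing between the two modes, which the paper discusses as deployment challenges.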