Speak4it is a consumer-oriented mobile search application that leverages multimodal input and output to let users search for and act on local business information. It supports true multimodal integration, where a single user input can be distributed over multiple input modes. In addition to specifying queries by voice alone (e.g., "bike repair shops near the golden gate bridge"), users can combine speech and gesture. For example, saying "gas stations" while tracing a route on the display returns the gas stations along that route. We provide interactive demonstrations of Speak4it on both the iPhone and iPad platforms and explain the underlying multimodal architecture and the challenges of supporting multimodal interaction as a deployed mobile service.
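To make the speech-plus-gesture example concrete, the sketch below shows one way such multimodal fusion could work: the spoken query supplies the *what* (a business category) and the gesture trace supplies the *where* (a route polyline), and results are filtered to businesses near the trace. All names, data structures, and the distance-based filtering logic here are illustrative assumptions, not the deployed Speak4it implementation.

```python
from dataclasses import dataclass
from math import hypot

# Illustrative sketch only -- not the actual Speak4it architecture.

@dataclass
class Business:
    name: str
    x: float  # map coordinates, arbitrary units
    y: float

def point_to_segment_distance(px, py, ax, ay, bx, by):
    """Shortest distance from point (px, py) to segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return hypot(px - ax, py - ay)
    # Project the point onto the segment, clamped to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return hypot(px - (ax + t * dx), py - (ay + t * dy))

def multimodal_search(speech_query, gesture_trace, index, radius=1.0):
    """Fuse speech (what to find) with gesture (where to look):
    return businesses in the spoken category within `radius` of the trace."""
    results = []
    for b in index.get(speech_query, []):
        dist = min(point_to_segment_distance(b.x, b.y, *p1, *p2)
                   for p1, p2 in zip(gesture_trace, gesture_trace[1:]))
        if dist <= radius:
            results.append(b.name)
    return results

# Usage: "gas stations" spoken + a route traced on the display.
index = {"gas stations": [Business("Shell", 1.0, 0.2),
                          Business("Chevron", 5.0, 5.0)]}
route = [(0.0, 0.0), (2.0, 0.0)]  # simplified gesture polyline
print(multimodal_search("gas stations", route, index))  # -> ['Shell']
```

The key design point this sketch illustrates is that neither mode alone determines the result: the same spoken category yields different answers for different traced routes, which is what the abstract means by inputs distributed over multiple modes.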