Speak4it is a consumer-oriented mobile search application that leverages multimodal input and output to let users search for, and act on, local business information. It supports true multimodal integration, in which a single user input can be distributed over multiple input modes. In addition to specifying queries by voice alone (e.g., "bike repair shops near the golden gate bridge"), users can combine speech and gesture: for example, saying "gas stations" while tracing a route on the display returns the gas stations along that route. We describe the underlying multimodal architecture and some of the challenges of supporting multimodal interaction as a deployed mobile service.
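The speech-plus-gesture query described above can be illustrated with a minimal sketch. This is not Speak4it's actual implementation; it simply shows the idea of fusing a spoken category ("gas stations") with a drawn route trace by filtering candidate businesses to those lying near the traced polyline. All names (`Business`, `multimodal_search`, the distance threshold) are hypothetical, and locations are simplified to planar coordinates rather than latitude/longitude.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Business:
    name: str
    category: str
    x: float  # simplified planar coordinates (a real system would use lat/lon)
    y: float

def _point_to_segment_distance(px, py, ax, ay, bx, by):
    """Shortest distance from point (px, py) to segment (ax, ay)-(bx, by)."""
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return hypot(px - ax, py - ay)
    # Project the point onto the segment, clamping to its endpoints.
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return hypot(px - (ax + t * dx), py - (ay + t * dy))

def multimodal_search(spoken_category, gesture_trace, businesses, radius=1.0):
    """Fuse a spoken category with a gesture trace (route polyline):
    return names of matching businesses within `radius` of the trace."""
    results = []
    for b in businesses:
        if b.category != spoken_category:  # constraint from the speech mode
            continue
        # Constraint from the gesture mode: proximity to the traced route.
        d = min(_point_to_segment_distance(b.x, b.y, *p, *q)
                for p, q in zip(gesture_trace, gesture_trace[1:]))
        if d <= radius:
            results.append(b.name)
    return results
```

For instance, with a trace `[(0, 0), (2, 0)]` and radius 1.0, a gas station at `(0, 0.5)` is returned, while one at `(5, 5)` or a bike repair shop on the route is not: each mode contributes an independent constraint, and the fusion step intersects them.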