Location grounding in multimodal local search
International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction
Speak4it™ is a mobile search application that leverages multimodal input and integration to let users search for and act on local business information. We present an initial empirical analysis of user interaction with a multimodal local search application deployed in the field with real users. Specifically, we focus on queries involving multimodal commands and analyze the multimodal interaction behaviors observed in the deployed system.