Sketch recognition can capture the sketching component of a multimodal conversation about design, but it does not capture information conveyed in the other modalities. The informal speech that accompanies a sketch often carries considerable additional information. We want to develop a digital whiteboard that understands both sketching and speech, and that can participate in a conversation much like one the user would have with a human design partner. We conducted a user study to help us understand what kinds of conversations users would have with a whiteboard capable of recognizing their sketches. We report results that we believe will help guide the design of an effective multimodal interface, and we discuss the implications for system architectures.