SearchTogether: an interface for collaborative web search
Proceedings of the 20th annual ACM symposium on User interface software and technology
An empirical investigation of multimodal interfaces for browsing internet search results
AIC'07 Proceedings of the 7th WSEAS International Conference on Applied Informatics and Communications - Volume 7
A survey of collaborative web search practices
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
The OpenInterface framework: a tool for multimodal interaction
CHI '08 Extended Abstracts on Human Factors in Computing Systems
Algorithmic mediation for collaborative exploratory search
Proceedings of the 31st annual international ACM SIGIR conference on Research and development in information retrieval
A meta user interface to control multimodal interaction in smart environments
Proceedings of the 14th international conference on Intelligent user interfaces
Co-located collaborative web search: understanding status quo practices
CHI '09 Extended Abstracts on Human Factors in Computing Systems
Usability framework for the design and evaluation of multimodal interaction
BCS-HCI '08 Proceedings of the 22nd British HCI Group Annual Conference on People and Computers: Culture, Creativity, Interaction - Volume 2
Search User Interfaces
www.MMRetrieval.net: a multimodal search engine
Proceedings of the Third International Conference on SImilarity Search and APplications
Emerging trends in search user interfaces
Proceedings of the 22nd ACM conference on Hypertext and hypermedia
I-SEARCH: a multimodal search engine based on rich unified content description (RUCoD)
Proceedings of the 21st international conference companion on World Wide Web
I-SEARCH: a unified framework for multimodal search and retrieval
The Future Internet
Multimodal interaction provides the user with multiple modes of interacting with a system, such as gestures, speech, text, video, and audio; a multimodal system thus offers several distinct means of data input and output. In this paper, we present our work in the context of the I-SEARCH project, which aims to enable context-aware querying of a multimodal search framework that incorporates real-world data such as user location or temperature. We introduce MuSeBag for multimodal query interfaces, UIIFace for multimodal interaction handling, and CoFind for collaborative search as the core components behind the I-SEARCH multimodal user interface, and we evaluate the interface in a user study.
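As a rough illustration of the kind of context-aware multimodal query such an interface assembles, the sketch below bundles several input modalities with real-world sensor data in a single query object. It is a minimal, hypothetical example: the class and field names (MultimodalQuery, ContextData, and so on) are assumptions for illustration and do not reflect the actual I-SEARCH or MuSeBag APIs.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ContextData:
    """Real-world context attached to a query (illustrative fields only)."""
    latitude: Optional[float] = None
    longitude: Optional[float] = None
    temperature_celsius: Optional[float] = None


@dataclass
class MultimodalQuery:
    """A query combining several input modalities with contextual data."""
    text: Optional[str] = None
    audio_sample_uri: Optional[str] = None
    image_uri: Optional[str] = None
    context: ContextData = field(default_factory=ContextData)

    def modalities(self) -> List[str]:
        """List which modalities the user actually supplied."""
        present = []
        if self.text:
            present.append("text")
        if self.audio_sample_uri:
            present.append("audio")
        if self.image_uri:
            present.append("image")
        return present


# Example: a text + audio query enriched with the user's location.
query = MultimodalQuery(
    text="church bells",
    audio_sample_uri="file:///tmp/recording.wav",
    context=ContextData(latitude=48.2082, longitude=16.3738),
)
print(query.modalities())  # ['text', 'audio']
```

In such a design, the query interface only collects whichever modalities the user chooses to provide, and the context fields are filled in transparently from device sensors before the query is sent to the search back end.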