A distributed staged architecture for multimodal applications
ECSA'07 Proceedings of the First European conference on Software Architecture
Next-generation applications will support a variety of modalities provided by more than one device: users may carry multiple devices with them or simply use fixed devices in their environment, and the applications will adapt their rendering to changes in the user's context. Enabling widespread use of such applications requires an infrastructure that supports a convenient authoring process as well as a comprehensive runtime providing both multimodality and context services. This paper describes the Multimodality Services Component, part of a context-aware runtime for multimodal applications developed in the EMODE project. The Multimodality Services Component enables multimodal interaction with EMODE applications and handles the adaptation and transformation of modality-independent user interface descriptions into modality-specific parts, from which user interfaces are generated on the target devices. It also performs input coordination and interaction with the associated business logic.
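The core transformation step described above can be illustrated with a minimal sketch. This is not the EMODE API; all names, the dictionary-based description format, and the two target modalities (GUI and voice) are illustrative assumptions about how a modality-independent description might be mapped to modality-specific parts.

```python
# Hypothetical sketch: one abstract, modality-independent UI description
# is transformed into two modality-specific representations.

# Assumed description format: a task with typed abstract elements.
ABSTRACT_UI = {
    "task": "confirm_order",
    "elements": [
        {"id": "prompt", "type": "output", "text": "Confirm your order?"},
        {"id": "answer", "type": "choice", "options": ["yes", "no"]},
    ],
}

def to_gui(description):
    """Map abstract elements to graphical widgets (labels and buttons)."""
    widgets = []
    for el in description["elements"]:
        if el["type"] == "output":
            widgets.append({"widget": "label", "text": el["text"]})
        elif el["type"] == "choice":
            widgets.extend({"widget": "button", "label": o}
                           for o in el["options"])
    return widgets

def to_voice(description):
    """Map abstract elements to a voice dialogue fragment:
    outputs become the spoken prompt, choices become the input grammar."""
    prompt = " ".join(el["text"] for el in description["elements"]
                      if el["type"] == "output")
    grammar = [o for el in description["elements"]
               if el["type"] == "choice" for o in el["options"]]
    return {"prompt": prompt, "grammar": grammar}

gui = to_gui(ABSTRACT_UI)      # one label plus one button per option
voice = to_voice(ABSTRACT_UI)  # spoken prompt plus a yes/no grammar
```

The same abstract description thus yields a widget list for a graphical device and a prompt/grammar pair for a voice device, which is the kind of adaptation the runtime would perform per target device.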