Multimodal information access across multiple devices

  • Authors:
  • Gerald Huebsch; Kay Kadner

  • Affiliations:
  • Technische Universität Dresden, Dresden, Germany; Technische Universität Dresden, Dresden, Germany, and SAP Research CEC Dresden, Dresden, Germany

  • Venue:
  • Mobility '07: Proceedings of the 4th International Conference on Mobile Technology, Applications, and Systems and the 1st International Symposium on Computer Human Interaction in Mobile Technology
  • Year:
  • 2007

Abstract

Next-generation applications will support a variety of modalities provided by more than one device. Users may carry multiple devices with them or simply use fixed devices in their environment. Such applications will adapt their rendering to changes in the user's context. To enable widespread use of these applications, an infrastructure must be developed that supports a convenient authoring process as well as a comprehensive runtime providing both multimodality and context services. This paper describes the Multimodality Services Component, part of a context-aware runtime for multimodal applications developed in the EMODE project. In particular, the Multimodality Services Component is responsible for enabling multimodal interaction with EMODE applications and for adapting and transforming modality-independent user interface descriptions into modality-specific parts, from which user interfaces can be generated on the devices in use. Furthermore, the Multimodality Services Component performs input coordination and interacts with the associated business logic.
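The core transformation step the abstract describes, turning one modality-independent user interface description into several modality-specific parts, can be pictured as in the following minimal Java sketch. This is an illustration only, not the paper's actual EMODE API; all type and method names here (AbstractElement, ModalityTransformer, GraphicalTransformer, VoiceTransformer) are hypothetical assumptions.

  // Hypothetical sketch of a modality transformation step; the EMODE
  // interfaces and class names are assumptions, not the paper's actual API.
  import java.util.List;

  /** A modality-independent UI element (assumed abstract interaction object). */
  record AbstractElement(String id, String role, String label) {}

  /** A modality-specific rendering produced for one target modality. */
  record ConcreteElement(String id, String markup) {}

  /** Transforms abstract UI descriptions into modality-specific parts. */
  interface ModalityTransformer {
      ConcreteElement transform(AbstractElement element);
  }

  /** Example transformer targeting a graphical (HTML-like) modality. */
  class GraphicalTransformer implements ModalityTransformer {
      public ConcreteElement transform(AbstractElement e) {
          String markup = switch (e.role()) {
              case "input"  -> "<input id='" + e.id() + "' placeholder='" + e.label() + "'/>";
              case "select" -> "<select id='" + e.id() + "'><!-- options --></select>";
              default       -> "<span id='" + e.id() + "'>" + e.label() + "</span>";
          };
          return new ConcreteElement(e.id(), markup);
      }
  }

  /** Example transformer targeting a voice modality (VoiceXML-like prompt). */
  class VoiceTransformer implements ModalityTransformer {
      public ConcreteElement transform(AbstractElement e) {
          return new ConcreteElement(e.id(),
              "<prompt>Please provide " + e.label() + "</prompt>");
      }
  }

  public class TransformationDemo {
      public static void main(String[] args) {
          // One abstract description, rendered for two modalities/devices.
          List<AbstractElement> ui = List.of(
              new AbstractElement("city", "input", "destination city"));
          ModalityTransformer gui = new GraphicalTransformer();
          ModalityTransformer voice = new VoiceTransformer();
          for (AbstractElement e : ui) {
              System.out.println(gui.transform(e).markup());
              System.out.println(voice.transform(e).markup());
          }
      }
  }

The point of the sketch is the one-to-many mapping: a single abstract description drives independent transformers, one per modality, which matches the abstract's claim that modality-specific parts are generated per device rather than authored separately.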