Incorporating tilt-based interaction in multimodal user interfaces for mobile devices

  • Authors:
  • Jani Mäntyjärvi; Fabio Paternò; Carmen Santoro

  • Affiliations:
  • VTT Technical Research Centre, Oulu, Finland; ISTI-CNR, Pisa, Italy; ISTI-CNR, Pisa, Italy

  • Venue:
  • TAMODIA'06 Proceedings of the 5th international conference on Task models and diagrams for users interface design
  • Year:
  • 2006

Abstract

Emerging ubiquitous environments raise the need to support multiple interaction modalities on diverse types of devices. Designing multimodal interfaces for ubiquitous environments with development tools is challenging because target platforms differ in the resources and interaction capabilities they support. Model-based approaches have been recognized as useful for managing the increasing complexity resulting from the many available interaction platforms; however, they have usually focused on graphical and/or vocal modalities. This paper presents a solution for supporting tilt-based hand-gesture and graphical modalities for mobile devices within a multimodal user interface development tool. The challenges of developing gesture-based applications for various types of devices, including mobile devices, are discussed in detail. The proposed solution is based on a logical description language for hand-gesture user interfaces, from which a user interface implementation for the target mobile platform can be obtained. The solution is illustrated with an example application that can be accessed both from a desktop and from a mobile device supporting tilt-based gesture interaction.
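To make the notion of tilt-based gesture interaction mentioned in the abstract more concrete, the sketch below shows one common way raw accelerometer readings could be classified into coarse tilt gestures and bound to abstract navigation actions. This is an illustrative assumption only, not the paper's actual technique or its logical description language; all names, thresholds, and the gesture-to-action mapping are introduced here purely for illustration.

```python
import math
from typing import Optional

# Assumed tilt threshold in degrees (illustrative, not taken from the paper).
TILT_THRESHOLD_DEG = 25.0

def classify_tilt(ax: float, ay: float, az: float) -> Optional[str]:
    """Classify one accelerometer sample (in g) into a coarse tilt gesture.

    Returns 'tilt_left', 'tilt_right', 'tilt_forward', 'tilt_backward',
    or None when the device is held roughly level.
    """
    # Pitch: rotation about the device's x-axis (forward/backward tilt).
    pitch = math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))
    # Roll: rotation about the device's y-axis (left/right tilt).
    roll = math.degrees(math.atan2(ay, az))

    if roll > TILT_THRESHOLD_DEG:
        return "tilt_right"
    if roll < -TILT_THRESHOLD_DEG:
        return "tilt_left"
    if pitch > TILT_THRESHOLD_DEG:
        return "tilt_forward"
    if pitch < -TILT_THRESHOLD_DEG:
        return "tilt_backward"
    return None

# Hypothetical binding of recognized gestures to abstract UI navigation actions,
# in the spirit of attaching gestures to logical interactors.
GESTURE_TO_ACTION = {
    "tilt_right": "next_element",
    "tilt_left": "previous_element",
    "tilt_forward": "select",
    "tilt_backward": "back",
}

if __name__ == "__main__":
    # Device rolled to the right by roughly 45 degrees (ay ~ az).
    gesture = classify_tilt(0.0, 0.7, 0.7)
    print(gesture, "->", GESTURE_TO_ACTION.get(gesture))
```

In the approach described by the paper, such gesture-to-action bindings would not be hand-coded per device; they would be derived from the logical description of the hand-gesture user interface, from which the implementation for the target mobile platform is obtained.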