Emerging ubiquitous environments raise the need to support multiple interaction modalities across diverse types of devices. Designing multimodal interfaces for such environments with development tools is challenging because target platforms differ in the resources and interaction techniques they support. Model-based approaches have been recognized as useful for managing the complexity that arises from the many available interaction platforms, but they have usually focused on graphical and/or vocal modalities. This paper presents a solution for supporting tilt-based hand-gesture and graphical modalities for mobile devices in a multimodal user interface development tool. The challenges of developing gesture-based applications for various types of devices, including mobile ones, are discussed in detail. The proposed solution is based on a logical description language for hand-gesture user interfaces, from which a user interface implementation for the target mobile platform can be obtained. The solution is illustrated with an example application that can be accessed both from the desktop and from a mobile device supporting tilt-based gesture interaction.
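To make the idea of a logical description of tilt-based gestures concrete, the following is a minimal sketch of how such a description might bind tilt ranges to abstract interaction events that a platform-specific implementation could then render. All names, thresholds, and the overall structure are illustrative assumptions, not the language defined in the paper.

```python
# Hypothetical sketch of a logical gesture description: each entry binds a
# tilt range on one axis to an abstract interaction event. A transformation
# step (not shown) would map these events to concrete UI actions on the
# target mobile platform. Names and thresholds are assumptions for
# illustration only.
from dataclasses import dataclass

@dataclass
class TiltGesture:
    event: str       # abstract event name, e.g. "next_item"
    axis: str        # "pitch" or "roll"
    min_deg: float   # lower bound of the tilt range, in degrees
    max_deg: float   # upper bound of the tilt range, in degrees

# A small logical vocabulary for tilt-based list navigation.
GESTURES = [
    TiltGesture("next_item", "roll", 20.0, 90.0),        # tilt right
    TiltGesture("previous_item", "roll", -90.0, -20.0),  # tilt left
    TiltGesture("select_item", "pitch", -90.0, -25.0),   # tilt toward user
]

def resolve_event(pitch: float, roll: float) -> str:
    """Map one accelerometer sample to the first matching abstract event."""
    sample = {"pitch": pitch, "roll": roll}
    for g in GESTURES:
        if g.min_deg <= sample[g.axis] <= g.max_deg:
            return g.event
    return "idle"  # no gesture recognized in this sample
```

Keeping the gesture vocabulary as data rather than code is what allows the same logical description to be retargeted: a desktop renderer can ignore it or map the events to keys, while a mobile renderer connects it to the device's tilt sensor.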