Towards a unified gesture description language

  • Authors: Florian Echtler; Gudrun Klinker; Andreas Butz

  • Affiliations: Munich University of Applied Sciences; Technische Universität München; Ludwig-Maximilians-Universität München

  • Venue: Proceedings of the 13th International Conference on Humans and Computers
  • Year: 2010


Abstract

The proliferation of novel gesture-based user interfaces has led to considerable fragmentation, both in program code and in the gestures themselves. Consequently, it is difficult for developers to build on previous work, wasting valuable development time. Moreover, the flexibility of the resulting user interfaces is limited, particularly with respect to users who wish to customize the interface. To address this problem, we present a generic and extensible formal language for describing gestures. This language is applicable to a wide variety of input devices, such as multi-touch surfaces, pen-based input, tangible objects and even free-hand gestures. It enables the development of a generic gesture recognition engine which can serve as a backend to a wide variety of user interfaces. Moreover, rapid customization of the interface becomes possible by simply swapping gesture definitions, an aspect which offers considerable advantages when conducting UI research or porting an existing application to a new type of input device. Developers benefit from the reduced amount of code, while users benefit from the increased flexibility through customization afforded by this approach.
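The core idea of the abstract, gesture definitions as swappable data interpreted by a generic engine rather than hard-coded recognizers, can be sketched as follows. This is a minimal illustration, not the paper's actual language or engine: the names (`Gesture`, `recognize`) and the two toy feature constraints (contact count, path length) are assumptions made for the example.

```python
# Hypothetical sketch of a definition-driven gesture engine.
# Gestures are plain data; the engine code never names a specific gesture,
# so swapping the definition set retargets the interface without code changes.

from dataclasses import dataclass

@dataclass
class Gesture:
    """A gesture described by simple feature constraints (illustrative only)."""
    name: str
    min_fingers: int      # minimum number of simultaneous contacts
    min_distance: float   # minimum total path length of the motion

def recognize(definitions, fingers, distance):
    """Return the name of the first definition the input satisfies, else None."""
    for g in definitions:
        if fingers >= g.min_fingers and distance >= g.min_distance:
            return g.name
    return None

# Two interchangeable definition sets: one for a multi-touch surface,
# one for pen input. The engine above stays unchanged.
touch_profile = [Gesture("pinch", 2, 10.0), Gesture("drag", 1, 5.0)]
pen_profile   = [Gesture("drag", 1, 2.0)]

print(recognize(touch_profile, fingers=2, distance=12.0))  # pinch
print(recognize(pen_profile, fingers=2, distance=12.0))    # drag
```

Swapping `touch_profile` for `pen_profile` changes which gesture the same raw input maps to, which is the customization and porting benefit the abstract describes.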