GISpL: gestures made easy

  • Authors:
  • Florian Echtler; Andreas Butz

  • Affiliations:
  • Munich University of Applied Sciences / Siemens Corporate Technology; Ludwig-Maximilians-Universität München

  • Venue:
  • Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction
  • Year:
  • 2012

Abstract

We present GISpL, the Gestural Interface Specification Language. GISpL is a formal language which allows both researchers and developers to unambiguously describe the behavior of a wide range of gestural interfaces using a simple JSON-based syntax. GISpL supports a multitude of input modalities, including multi-touch, digital pens, multiple regular mice, tangible interfaces, and mid-air gestures. GISpL introduces a novel view of gestural interfaces from a software-engineering perspective. By using GISpL, developers can avoid tedious tasks such as repeatedly reimplementing the same gesture recognition algorithms. Researchers benefit from the ability to quickly reconfigure prototypes of gestural UIs on the fly, possibly even in the middle of an expert review. In this paper, we present a brief overview of GISpL as well as some usage examples of our reference implementation. We demonstrate its capabilities with the example of a multichannel audio mixer application used with several different input modalities. Moreover, we present exemplary GISpL descriptions of other gestural interfaces and conclude by discussing potential applications and future development.
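
To give a concrete sense of the JSON-based syntax the abstract describes, the following is a hypothetical sketch, written in Python and serialized to JSON, of what a GISpL-style gesture description might look like. The field names used here ("name", "flags", "features", "type", "constraints") and the specific feature types are illustrative assumptions based only on the abstract's description of the language; they are not taken from the actual GISpL specification.

    # Hypothetical sketch of a GISpL-style gesture description.
    # All field names and values below are illustrative assumptions,
    # not the actual GISpL schema.
    import json

    # Illustrative description of a two-contact "rotate" gesture, e.g. for
    # turning a dial in the multichannel audio mixer example mentioned above.
    rotate_dial = {
        "name": "rotate_dial",       # identifier chosen by the developer (assumed field)
        "flags": "sticky",           # assumed flag binding the gesture to its screen region
        "features": [
            # exactly two input objects (fingers, pens, mice, tangibles, ...)
            {"type": "Count", "constraints": [2, 2]},
            # relative rotation of the contacts, here constrained to a range in radians
            {"type": "Rotation", "constraints": [0.1, 6.3]}
        ]
    }

    # The resulting JSON document would be handed to a GISpL engine, which
    # matches incoming input events against it regardless of input modality.
    print(json.dumps(rotate_dial, indent=2))

Because such a description is plain JSON data rather than compiled code, it could be swapped or edited at runtime, which is the kind of on-the-fly reconfiguration of gestural UI prototypes the abstract highlights.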