We present GISpL, the Gestural Interface Specification Language. GISpL is a formal language that allows both researchers and developers to unambiguously describe the behavior of a wide range of gestural interfaces using a simple JSON-based syntax. GISpL supports a multitude of input modalities, including multi-touch, digital pens, multiple regular mice, tangible interfaces, and mid-air gestures. GISpL introduces a novel view of gestural interfaces from a software-engineering perspective. By using GISpL, developers can avoid tedious tasks such as reimplementing the same gesture recognition algorithms over and over again. Researchers benefit from the ability to quickly reconfigure prototypes of gestural UIs on the fly, possibly even in the middle of an expert review. In this paper, we present a brief overview of GISpL as well as some usage examples of our reference implementation. We demonstrate its capabilities using the example of a multichannel audio mixer application controlled through several different input modalities. Moreover, we present exemplary GISpL descriptions of other gestural interfaces and conclude by discussing potential applications and future development.
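To give a flavor of what a JSON-based gesture description might look like, the following is a purely illustrative sketch of a single-finger drag gesture. All field names here (`name`, `features`, `type`, `constraints`) are hypothetical stand-ins, not necessarily the exact vocabulary GISpL defines; the actual syntax is specified in the paper and reference implementation.

```json
{
  "name": "one_finger_drag",
  "features": [
    { "type": "Count",  "constraints": [1, 1] },
    { "type": "Motion" }
  ]
}
```

The appeal of such a declarative description is that it is data, not code: it could in principle be edited or swapped out at runtime, which is what enables the on-the-fly reconfiguration of gestural UI prototypes mentioned above.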