A flexible approach to gesture recognition and interaction in X3D
Proceedings of the 17th International Conference on 3D Web Technology
The proliferation of novel gesture-based user interfaces has led to considerable fragmentation, both in program code and in the gestures themselves. As a result, it is difficult for developers to build on previous work, which consumes valuable development time. Moreover, the flexibility of the resulting user interfaces is limited, particularly with respect to users who wish to customize the interface. To address this problem, we present a generic and extensible formal language for describing gestures. The language applies to a wide variety of input devices, such as multi-touch surfaces, pen-based input, tangible objects and even free-hand gestures. It enables the development of a generic gesture recognition engine that can serve as a backend to a wide variety of user interfaces. Rapid customization of the interface also becomes possible by simply swapping gesture definitions, which offers considerable advantages when conducting UI research or porting an existing application to a new type of input device. Developers benefit from the reduced amount of code, while users benefit from the increased flexibility through customization afforded by this approach.
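To make the architecture described above concrete, the following is a minimal, hypothetical sketch of the idea of decoupling declarative gesture definitions from a device-agnostic recognition engine. All names, types and the predicate-based definition format are illustrative assumptions and do not reflect the paper's actual gesture description language or API.

# Hypothetical sketch: declarative gesture definitions consumed by a
# generic recognition engine. Swapping the definition set re-targets
# the same engine to a different interface or input device.
from dataclasses import dataclass
from typing import Callable, List, Tuple

Point = Tuple[float, float]  # a single sampled input position

@dataclass
class GestureDefinition:
    name: str                                  # e.g. "swipe_right"
    predicate: Callable[[List[Point]], bool]   # condition over an input trace

class GestureEngine:
    """Device-agnostic engine: operates on any stream of 2D points
    (touch, pen, tracked hand) and is customized purely through the
    gesture definitions it is given."""

    def __init__(self, definitions: List[GestureDefinition]):
        self.definitions = definitions

    def recognize(self, trace: List[Point]) -> List[str]:
        # Return the names of all definitions whose predicate matches.
        return [d.name for d in self.definitions if d.predicate(trace)]

def moved_right(trace: List[Point]) -> bool:
    # Trivial example condition: net rightward movement above a threshold.
    return len(trace) >= 2 and trace[-1][0] - trace[0][0] > 100

# Two interchangeable gesture sets; the application code never changes.
touch_gestures = [GestureDefinition("swipe_right", moved_right)]
pen_gestures = [GestureDefinition("stroke_right", moved_right)]

engine = GestureEngine(touch_gestures)       # backend for a touch UI
print(engine.recognize([(0, 0), (150, 5)]))  # -> ['swipe_right']

engine.definitions = pen_gestures            # re-targeted to pen input
print(engine.recognize([(0, 0), (150, 5)]))  # -> ['stroke_right']

In this sketch, customization amounts to replacing the list of GestureDefinition objects, mirroring the abstract's claim that an interface can be adapted to a new input device by swapping gesture definitions rather than rewriting recognition code.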