Today's technologies, such as smartphones, tablets, and other flat interaction surfaces, have increased the need for graphical user interfaces that integrate gestural interaction in which 2D pen-based gestures are properly used. Integrating this interaction modality into streamlined software development represents a significant challenge for designers and developers: it requires substantial knowledge of gesture management, of deciding which gesture recognition algorithm should be used or refined for which types of gestures, and of which usability knowledge should support the development. These skills usually belong to gesture interaction experts rather than to the actors typically involved in the user interface design process. In this paper, we present a structured method for facilitating the integration of gestures into graphical user interfaces by describing the roles of the gesture specialist and the other stakeholders involved in the development life cycle, and the cooperation process leading to the creation of a gesture-based user interface. The method rests on three pillars: a conceptual model for describing gestures on top of graphical user interfaces and its associated language, a step-wise approach for defining gestures according to the end user's task, and a software tool that supports this approach. The method is exemplified with a running example in the area of document navigation.
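To make the idea of the first pillar more concrete, the following is a minimal, hypothetical sketch of what a declarative binding between 2D pen-based gestures and end-user tasks might look like; the paper's actual conceptual model and language are not reproduced here, and all names (GestureBinding, next_page, the stroke templates) are invented for illustration only, using the document-navigation running example from the abstract.

```python
# Hypothetical sketch only: illustrates declaratively binding 2D pen-based
# gestures to user-interface tasks; not the paper's actual language.
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class GestureBinding:
    name: str                                # human-readable gesture name
    template: List[Tuple[float, float]]      # example stroke points used by a recognizer
    task: str                                # end-user task the gesture is meant to trigger
    action: Callable[[], None]               # UI callback invoked when the gesture is recognized

def next_page() -> None:
    print("navigate to next page")

def previous_page() -> None:
    print("navigate to previous page")

# Running example from the abstract: gestures for document navigation.
bindings = [
    GestureBinding("swipe-right", [(0.0, 0.5), (1.0, 0.5)], "go to next page", next_page),
    GestureBinding("swipe-left",  [(1.0, 0.5), (0.0, 0.5)], "go to previous page", previous_page),
]

for b in bindings:
    b.action()  # a real system would invoke this only after gesture recognition
```

Such a separation between the gesture description (name and template), the task it supports, and the interface action it triggers is one plausible way to let a gesture specialist and interface designers cooperate without each needing the other's expertise.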