Specifying gestures by example. Proceedings of the 18th Annual Conference on Computer Graphics and Interactive Techniques.
Implications for a gesture design tool. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Visual similarity of pen gestures. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Model for unistroke writing time. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Digital Image Processing: PIKS Inside.
Helping designers create recognition-enabled interfaces. Multimodal Interface for Human-Machine Communication.
M/ORIS: a medical/operating room interaction system. Proceedings of the 6th International Conference on Multimodal Interfaces.
SHARK2: a large vocabulary shorthand writing system for pen-based computers. Proceedings of the 17th Annual ACM Symposium on User Interface Software and Technology.
Modeling human performance of pen stroke gestures. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes. Proceedings of the 20th Annual ACM Symposium on User Interface Software and Technology.
Graffiti vs. unistrokes: an empirical comparison. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
OctoPocus: a dynamic guide for learning gesture-based command sets. Proceedings of the 21st Annual ACM Symposium on User Interface Software and Technology.
User-defined gestures for surface computing. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Using strokes as command shortcuts: cognitive benefits and toolkit support. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Proceedings of the 15th International Conference on Intelligent User Interfaces.
Usable gestures for mobile interfaces: evaluating social acceptability. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
MAGIC: a motion gesture design tool. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Protractor: a fast and accurate gesture recognizer. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Understanding users' preferences for surface gestures. Proceedings of Graphics Interface 2010.
The effect of sampling rate on the performance of template-based gesture recognizers. Proceedings of the 13th International Conference on Multimodal Interfaces (ICMI '11).
Proceedings of the 2012 ACM International Conference on Interactive Tabletops and Surfaces.
Designing gestural interfaces for the interactive TV. Proceedings of the 11th European Conference on Interactive TV and Video.
How we gesture towards machines: an exploratory study of user perceptions of gestural interaction. CHI '13 Extended Abstracts on Human Factors in Computing Systems.
Activity or product?: drawing and HCI. Proceedings of the International Conference on Multimedia, Interaction, Design and Innovation.
Relative accuracy measures for stroke gestures. Proceedings of the 15th ACM International Conference on Multimodal Interaction.
User perceptions of drawing logic diagrams with pen-centric user interfaces. Proceedings of Graphics Interface 2013.
Understanding the consistency of users' pen and finger stroke gesture articulation. Proceedings of Graphics Interface 2013.
Journal of Ambient Intelligence and Smart Environments.
Our empirical results show that users perceive the execution difficulty of single-stroke gestures consistently, and that execution difficulty is highly correlated with gesture production time. We use these results to design two simple rules for estimating execution difficulty: one establishes the relative ranking of difficulty among multiple gestures, and the other classifies a single gesture into one of five difficulty levels. We confirm that the CLC model does not accurately predict the magnitude of production time, and show instead that a reasonably accurate estimate can be computed from only a few gesture execution samples gathered from a few people. Using this estimated production time, our rules rank gesture difficulty with 90% accuracy and rate gesture difficulty with 75% accuracy, on average. Designers can use our results to choose application gestures, and researchers can build on our analysis in other gesture domains and for modeling gesture performance.
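As a rough illustration of how a designer might apply the ranking rule described in the abstract, the sketch below estimates each gesture's production time as the mean of a few recorded execution samples and orders the gestures from easiest to hardest. This is a minimal sketch, not the paper's implementation: the gesture names and timing values are invented, and the assumption that a longer estimated production time implies higher execution difficulty follows from the correlation reported above.

```python
from statistics import mean

def estimate_production_time(samples):
    """Estimate a gesture's production time (seconds) as the mean of a few samples."""
    return mean(samples)

def rank_by_difficulty(gesture_samples):
    """Rank gestures from easiest to hardest by estimated production time.

    Assumes (per the reported correlation) that longer production time
    implies higher execution difficulty.
    """
    estimates = {g: estimate_production_time(t) for g, t in gesture_samples.items()}
    return sorted(estimates, key=estimates.get)

# Hypothetical data: a few execution times (seconds) from a few people per gesture.
samples = {
    "circle":    [0.61, 0.58, 0.64],
    "star":      [1.42, 1.35, 1.50],
    "checkmark": [0.49, 0.52, 0.47],
}

print(rank_by_difficulty(samples))  # ['checkmark', 'circle', 'star']
```

The five-level difficulty classification would follow the same pattern, mapping the estimated production time to a level via thresholds; those thresholds come from the paper's analysis and are not reproduced here.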