Numerical recipes in C (2nd ed.): the art of scientific computing
Gestures without libraries, toolkits or training: a $1 recognizer for user interface prototypes
Proceedings of the 20th annual ACM symposium on User interface software and technology
Proceedings of the 6th Eurographics Symposium on Sketch-Based Interfaces and Modeling
Protractor: a fast and accurate gesture recognizer
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
A lightweight multistroke recognizer for user interface prototypes
Proceedings of Graphics Interface 2010
Gestures as point clouds: a $P recognizer for user interface prototypes
Proceedings of the 14th ACM international conference on Multimodal interaction
Proceedings of the 2012 ACM international conference on Interactive tabletops and surfaces
The impact of motion dimensionality and bit cardinality on the design of 3D gesture recognizers
International Journal of Human-Computer Studies
Memorability of pre-designed and user-defined gesture sets
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Interactive prototyping of tabletop and surface applications
Proceedings of the 5th ACM SIGCHI symposium on Engineering interactive computing systems
Proceedings of the 12th International Conference on Interaction Design and Children
Relative accuracy measures for stroke gestures
Proceedings of the 15th ACM on International conference on multimodal interaction
Understanding the consistency of users' pen and finger stroke gesture articulation
Proceedings of Graphics Interface 2013
Proceedings of the companion publication of the 19th international conference on Intelligent User Interfaces
Prior work introduced $N, a simple multistroke gesture recognizer based on template matching, derived from the unistroke $1 recognizer and intended to be easy to port to new platforms for rapid prototyping. Like $1 before it, $N uses an iterative search to find the optimal angular alignment between two gesture templates. Since then, Protractor has been introduced: a unistroke pen and finger gesture recognition algorithm also based on template matching and $1, but one that replaces the iterative search with a closed-form template-matching method, considerably improving recognition speed over $1. This paper presents work to streamline $N with Protractor's closed-form matching approach, and demonstrates that similar speed benefits carry over to multistroke gestures drawn from datasets in multiple domains. We find that the Protractor enhancements are over 91% faster than the original $N and negligibly less accurate. We also explore the effect that the input method (e.g., pen vs. finger) has on recognition accuracy, and examine the most confusable gestures.
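The core of Protractor's speedup is that the best angular alignment between two preprocessed gesture vectors has a closed-form solution, so no iterative (e.g., golden-section) search is needed. The following is a minimal Python sketch of that idea, not the authors' implementation: it assumes both gestures have already been resampled to the same number of points, translated so their centroid is at the origin, and scaled to unit vector length (the `normalize` helper here is illustrative).

```python
import math

def normalize(points):
    """Translate a gesture's centroid to the origin and scale the
    flattened (x, y) vector to unit length (illustrative preprocessing)."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    shifted = [(x - cx, y - cy) for x, y in points]
    mag = math.sqrt(sum(x * x + y * y for x, y in shifted))
    return [(x / mag, y / mag) for x, y in shifted]

def optimal_cosine_distance(template, candidate):
    """Closed-form minimum angular distance between two equal-length,
    normalized gestures, in the style of Protractor's matcher."""
    a = sum(tx * gx + ty * gy for (tx, ty), (gx, gy) in zip(template, candidate))
    b = sum(tx * gy - ty * gx for (tx, ty), (gx, gy) in zip(template, candidate))
    theta = math.atan2(b, a)  # optimal rotation, computed directly
    similarity = a * math.cos(theta) + b * math.sin(theta)
    return math.acos(max(-1.0, min(1.0, similarity)))
```

Because the rotation that maximizes similarity is computed in one pass rather than searched for, a rotated copy of a gesture scores (near-)zero distance to its template without any iteration, which is where the speed gain over $1's and the original $N's search comes from.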