Gestures as point clouds: a $P recognizer for user interface prototypes

  • Authors:
  • Radu-Daniel Vatavu; Lisa Anthony; Jacob O. Wobbrock

  • Affiliations:
  • University Stefan cel Mare of Suceava, Suceava, Romania; University of Maryland Baltimore County, Baltimore, MD, USA; University of Washington, Seattle, WA, USA

  • Venue:
  • Proceedings of the 14th ACM international conference on Multimodal interaction
  • Year:
  • 2012

Abstract

Rapid prototyping of gesture interaction for emerging touch platforms requires that developers have access to fast, simple, and accurate gesture recognition approaches. The $-family of recognizers ($1, $N) addresses this need, but the most advanced of these to date, $N-Protractor, incurs significant memory and execution costs due to its combinatoric representation of multistroke gestures. We present $P, a new member of the $-family that remedies this limitation by representing gestures as clouds of points. $P performs similarly to $1 on unistrokes and is superior to $N on multistrokes. Specifically, $P delivers 99% accuracy in user-dependent testing with 5+ training samples per gesture type and stays above 99% for user-independent tests when using data from 10 participants. We provide a pseudocode listing of $P to assist developers in porting it to their specific platforms, as well as a "cheat sheet" to aid them in selecting the best member of the $-family for their application needs.
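
To make the "gestures as clouds of points" idea concrete, here is a minimal Python sketch of point-cloud gesture matching in the spirit described by the abstract. It is not the paper's reference pseudocode: the function names (`resample`, `normalize`, `greedy_cloud_distance`, `cloud_distance`, `recognize`) and parameter values (n = 32 points, start-index step of 4) are illustrative assumptions, and gestures are assumed to arrive as lists of (x, y) tuples with stroke order ignored.

```python
import math

def resample(points, n=32):
    """Resample a gesture path to n points spaced evenly along its length."""
    path_len = sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))
    interval = path_len / (n - 1)
    pts = list(points)
    new_points = [pts[0]]
    D, i = 0.0, 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if D + d >= interval and d > 0:
            t = (interval - D) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            new_points.append(q)
            pts.insert(i, q)  # interpolated point becomes the next segment origin
            D = 0.0
        else:
            D += d
        i += 1
    while len(new_points) < n:  # guard against floating-point shortfall
        new_points.append(points[-1])
    return new_points[:n]

def normalize(points, n=32):
    """Resample, scale to a unit box, and translate the centroid to the origin."""
    pts = resample(points, n)
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    scale = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    cx, cy = sum(xs) / n, sum(ys) / n
    return [((x - cx) / scale, (y - cy) / scale) for x, y in pts]

def greedy_cloud_distance(a, b, start):
    """Greedily match each point of cloud a, starting at index `start`, to its
    nearest still-unmatched point of cloud b; later matches get smaller weights."""
    n = len(a)
    matched = [False] * n
    total, i = 0.0, start
    while True:
        best, best_j = float("inf"), -1
        for j in range(n):
            if not matched[j]:
                d = math.dist(a[i], b[j])
                if d < best:
                    best, best_j = d, j
        matched[best_j] = True
        weight = 1.0 - ((i - start + n) % n) / n
        total += weight * best
        i = (i + 1) % n
        if i == start:
            break
    return total

def cloud_distance(a, b, step=4):
    """Smaller of the two directional greedy distances over several start indices."""
    best = float("inf")
    for start in range(0, len(a), step):
        best = min(best,
                   greedy_cloud_distance(a, b, start),
                   greedy_cloud_distance(b, a, start))
    return best

def recognize(candidate, templates, n=32):
    """Return the name of the template whose point cloud is closest to the candidate."""
    c = normalize(candidate, n)
    scored = [(cloud_distance(c, normalize(t_pts, n)), name)
              for name, t_pts in templates]
    return min(scored)[1]
```

Because multistrokes are flattened into unordered point clouds before matching, this style of recognizer avoids enumerating stroke orders and directions, which is the combinatoric cost the abstract attributes to $N-Protractor.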