Designers of motion gestures for mobile devices face the difficult challenge of building a recognizer that can separate gestural input from motion noise. A threshold value is often used to classify motion and to balance the rates of false positives and false negatives. We present a bi-level threshold recognition technique designed to lower the rate of recognition failures: input is accepted either when it matches a tightly thresholded model or when two consecutive inputs match a relaxed model. An evaluation demonstrates that bi-level thresholding aids recognition for users who have trouble performing motion gestures. Lastly, we suggest using bi-level thresholding to scaffold the learning of gestures.
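The acceptance rule described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the score-based interface, and the specific threshold values are all assumptions made for the example.

```python
def make_bilevel_recognizer(tight_threshold, relaxed_threshold):
    """Accept a gesture if its recognizer score passes the tight threshold,
    or if two consecutive inputs for the same gesture pass the relaxed one.
    (Hypothetical interface; thresholds and names are illustrative.)"""
    assert tight_threshold >= relaxed_threshold
    state = {"prev_relaxed": None}  # last gesture that matched only the relaxed model

    def recognize(gesture_label, score):
        if score >= tight_threshold:
            state["prev_relaxed"] = None
            return gesture_label          # confident match: accept immediately
        if score >= relaxed_threshold:
            if state["prev_relaxed"] == gesture_label:
                state["prev_relaxed"] = None
                return gesture_label      # two consecutive relaxed matches: accept
            state["prev_relaxed"] = gesture_label
            return None                   # possible gesture: wait for a repeat
        state["prev_relaxed"] = None
        return None                       # below both thresholds: treat as noise

    return recognize
```

A single confident performance is accepted outright, while a user who twice produces a "close enough" attempt at the same gesture is also recognized, which is how the technique lowers the failure rate without simply loosening the single threshold.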