As motion sensors have become more advanced, gesture-control systems have grown popular in gaming and everyday appliances. In existing systems, however, gestures are predefined by designers or pattern-recognition experts, and such predefined gestures can be inconvenient for specific users in specific environments. It would therefore be useful to give end users the flexibility to design and customize gestures to satisfy their own needs. In this paper, we present a system that allows end users to design and customize gestures interactively. A key challenge is that arbitrary user-defined gestures can be difficult for the computer to recognize reliably: a gesture may be too similar to common unintentional movements, too difficult to distinguish from other gestures, and/or too difficult to perform consistently. Our system therefore first evaluates a user-defined gesture and then gives feedback on its suitability, guiding the user toward gestures that can be recognized reliably. A user study demonstrated that users designed more suitable gestures with such guidance than without it.
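The abstract does not specify how the system scores a candidate gesture, but evaluations of this kind are commonly built on a trajectory-similarity measure such as dynamic time warping (DTW), which can compare motion traces of different lengths. The sketch below is purely illustrative (not the authors' implementation): it uses DTW distance between repeated performances of a gesture as a proxy for how consistently the user can perform it, where the function names `dtw_distance` and `consistency_score` are hypothetical.

```python
import math

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D motion traces
    (e.g., accelerometer magnitude over time). Smaller values mean
    the traces are more similar, even if their lengths differ."""
    n, m = len(a), len(b)
    # cost[i][j] holds the DTW distance between a[:i] and b[:j].
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a sample in b
                                 cost[i][j - 1],      # skip a sample in a
                                 cost[i - 1][j - 1])  # align both samples
    return cost[n][m]

def consistency_score(samples):
    """Mean pairwise DTW distance across repeated performances of one
    gesture. A lower score suggests the user can reproduce the gesture
    consistently; a high score could trigger feedback to redesign it."""
    pairs = [(i, j) for i in range(len(samples))
                    for j in range(i + 1, len(samples))]
    total = sum(dtw_distance(samples[i], samples[j]) for i, j in pairs)
    return total / len(pairs)
```

The same distance could, in principle, also flag a gesture that lies too close to recorded everyday movement or to another gesture in the user's set, matching the three failure modes the abstract lists; the specific thresholds and features used by the actual system are not given here.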