Perceptual user interfaces take advantage of human perceptual capabilities to present semantic information in natural and intuitive ways. In this paper, we present a novel approach that provides users with an accelerometer-based interface for interactively controlling not only functions and devices in digital environments but also virtual characters in game-like scenarios. The proposed approach is general and suitable for both PC and mobile platforms; its core techniques include the automatic generation and preprocessing of training samples and the proper setup of machine learning models. Three sample applications are presented: a gesture-controlled slide presentation system using the Wii Remote, a gesture recognition system for making phone calls on the Nokia N95, and a performance-driven motion choreographing system using the Xsens MTx. Experimental results show a recognition rate of over 95%, which is quite acceptable for gesture interaction systems.
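The pipeline the abstract describes — generating training samples automatically, preprocessing accelerometer traces, then fitting a recognition model — can be illustrated with a minimal sketch. This is not the authors' actual system: the augmentation-by-jitter scheme, the resampling length, and the 1-nearest-neighbour classifier are all assumptions chosen for brevity, standing in for whatever machine learning model the paper actually configures.

```python
import numpy as np

def resample(sig, n=32):
    # Linearly resample a variable-length (T, 3) accelerometer trace to n frames,
    # so gestures of different durations become comparable.
    t_old = np.linspace(0.0, 1.0, len(sig))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack([np.interp(t_new, t_old, sig[:, a]) for a in range(sig.shape[1])], axis=1)

def augment(template, copies=20, noise=0.05, rng=None):
    # "Automatic generation of training samples" (assumed scheme): jitter one
    # recorded template with Gaussian noise to build a training set.
    rng = rng if rng is not None else np.random.default_rng(0)
    return [template + rng.normal(0.0, noise, template.shape) for _ in range(copies)]

class NearestTemplateClassifier:
    # Hypothetical stand-in model: 1-nearest-neighbour over preprocessed traces.
    def __init__(self, n=32):
        self.n = n
        self.X, self.y = [], []

    def _prep(self, sig):
        # Preprocessing: resample to fixed length and remove the constant
        # offset (gravity / sensor bias) per axis.
        s = resample(np.asarray(sig, dtype=float), self.n)
        return s - s.mean(axis=0)

    def fit(self, samples, labels):
        self.X = [self._prep(s) for s in samples]
        self.y = list(labels)
        return self

    def predict(self, sig):
        q = self._prep(sig)
        d = [np.linalg.norm(q - x) for x in self.X]
        return self.y[int(np.argmin(d))]
```

With two synthetic gestures (a slow "circle" and a fast one-axis "shake"), the classifier recovers the label of a freshly jittered trace; a real deployment would replace the synthetic templates with recorded Wii Remote, N95, or MTx data.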