We propose mCube, a novel, versatile gesture input device that supports both desktop and handheld interaction in ubiquitous computing environments. Like a computer mouse, mCube supports desktop interaction when moved across a planar surface; by lifting the device off the surface, users can seamlessly continue handheld interaction within the same application. Because mCube is a single, completely wireless device, it can be carried between and used with different display platforms. We explore the use of multiple sensors to support a wide range of tasks: gesture commands, multi-dimensional manipulation and navigation, and tool selection on a pie menu. This paper presents the design and implementation of the device, guided by a set of design principles, and demonstrates exploratory interaction techniques built on it. We also discuss the results of a user evaluation and directions for future work.
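The seamless transition the abstract describes, from mouse-like desktop use to free-space handheld use when the device is lifted, can be sketched as a simple mode switch driven by a surface-contact reading. The sketch below is purely illustrative: the class names, the `on_surface` sensor field, and the switching logic are assumptions for exposition, not the authors' actual implementation.

```python
# Illustrative sketch (not the mCube implementation): a device frame
# carries a surface-contact flag; lifting the device switches the
# interaction mode from "desktop" to "handheld" and touching down
# switches it back, so the same application keeps receiving input.

from dataclasses import dataclass


@dataclass
class SensorFrame:
    on_surface: bool      # hypothetical base contact/proximity reading
    dx: float = 0.0       # planar displacement, used in desktop mode
    dy: float = 0.0
    tilt: float = 0.0     # orientation change, used in handheld mode


class ModeSwitcher:
    """Tracks the current interaction mode from successive sensor frames."""

    def __init__(self) -> None:
        self.mode = "desktop"

    def update(self, frame: SensorFrame) -> str:
        # Lift-off -> handheld; touch-down -> desktop.
        self.mode = "desktop" if frame.on_surface else "handheld"
        return self.mode


switcher = ModeSwitcher()
print(switcher.update(SensorFrame(on_surface=True, dx=1.0)))    # desktop
print(switcher.update(SensorFrame(on_surface=False, tilt=0.3)))  # handheld
```

In a real device the raw contact signal would need debouncing so that brief sensor glitches do not flip modes mid-gesture; the point here is only that a single wireless unit can serve both roles by observing one state change.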