We present Mime, a compact, low-power 3D sensor for unencumbered, free-form, single-handed gestural interaction with head-mounted displays (HMDs). Mime introduces a real-time signal processing framework that combines a novel three-pixel time-of-flight (TOF) module with a standard RGB camera. The TOF module achieves accurate 3D hand localization and tracking, and it thus enables motion-controlled gestures. The joint processing of 3D information with RGB image data enables finer, shape-based gestural interaction. Our Mime hardware prototype achieves fast and precise 3D gestural control. Compared with state-of-the-art 3D sensors such as TOF cameras, the Microsoft Kinect, and the Leap Motion Controller, Mime offers several key advantages for mobile applications and HMD use cases: very small size, daylight insensitivity, and low power consumption. Mime is built using standard, low-cost optoelectronic components and promises to be an inexpensive technology that can either be a peripheral component or be embedded within the HMD unit. We demonstrate the utility of the Mime sensor for HMD interaction with a variety of application scenarios, including 3D spatial input using close-range gestures, gaming, on-the-move interaction, and operation in cluttered environments and in broad daylight conditions.
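The abstract does not spell out how three TOF pixels yield a 3D hand position. As a hedged illustration of the general principle (not the authors' actual signal processing framework), a round-trip time-of-flight measurement maps to a one-way distance as d = c·t/2, and three such range measurements from sensors at known baseline positions can localize a point by trilateration. A minimal sketch under an assumed planar sensor geometry with idealized, noise-free ranges:

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_to_distance(round_trip_s: float) -> float:
    """Convert a round-trip time of flight (seconds) to a one-way distance (meters)."""
    return C * round_trip_s / 2.0

def trilaterate(d: float, i: float, j: float,
                r1: float, r2: float, r3: float) -> tuple[float, float, float]:
    """Locate a point from three range measurements.

    Assumed sensor layout (a common textbook frame, not Mime's real geometry):
      S1 = (0, 0, 0),  S2 = (d, 0, 0),  S3 = (i, j, 0)
    r1, r2, r3 are the measured distances from the point to S1, S2, S3.
    """
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    # Choose z >= 0, i.e. the solution in front of the sensor plane.
    z = math.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    return (x, y, z)
```

For example, a point at (1, 2, 3) relative to sensors at (0,0,0), (4,0,0), and (1,3,0) produces ranges √14, √22, and √10, and `trilaterate(4, 1, 3, 14**0.5, 22**0.5, 10**0.5)` recovers (1, 2, 3). A real sensor would additionally need to handle range noise, ambient light rejection, and temporal filtering, which is presumably where the paper's signal processing framework comes in.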