We present a wearable interface consisting of motion sensors. Because the interface can be worn on a finger (as a ring) or fixed to it (with nail polish), the controlled device can be any generic object, provided it has an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested the appropriateness of these gestures for the index finger on the back of a handheld tablet whose rear offered three different form factors: flat, convex, and concave (undercut). Gesture performance was comparable across all three shapes; however, pitch performed better than swipe on every surface. The proposed interface is a step toward the idea of ubiquitous computing and the vision of seamless interaction with grasped objects. As an initial application scenario, we implemented a camera control that lets the brightness be configured with the tested gestures on a common SLR camera.
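The abstract does not specify how the four gestures are detected from the motion sensors. As a minimal illustrative sketch, one could classify short windows of finger-worn accelerometer and gyroscope samples with simple heuristics: a sustained rotation suggests pitch, a lateral sweep suggests swipe, and a brief spike normal to the surface suggests touch-down (tap) or lift-off (release). All function names, axis conventions, and thresholds below are assumptions, not the authors' method:

```python
# Hypothetical threshold-based recognizer for the four gestures
# (tap, release, swipe, pitch). Axis conventions and thresholds
# are illustrative assumptions; the paper does not give an algorithm.

def classify_gesture(window):
    """window: list of samples, each a dict with
    'accel' = (ax, ay, az) in g and 'gyro' = (gx, gy, gz) in rad/s.
    Returns 'tap', 'release', 'swipe', 'pitch', or None."""
    az = [abs(s['accel'][2]) for s in window]       # normal to the surface
    lateral = [abs(s['accel'][0]) for s in window]  # along the surface
    pitch = [abs(s['gyro'][1]) for s in window]     # rotation of the finger

    if max(pitch) > 1.0:        # sustained rotation -> pitch gesture
        return 'pitch'
    if max(lateral) > 0.8:      # lateral sweep -> swipe
        return 'swipe'
    if max(az) > 2.0:
        # sign of the peak separates touch-down (tap) from lift-off (release)
        i = az.index(max(az))
        return 'tap' if window[i]['accel'][2] > 0 else 'release'
    return None
```

In practice the thresholds would be tuned per user and per surface shape; a learned classifier would likely replace these heuristics in a robust implementation.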