This paper focuses on combining front- and back-of-device touch interaction on grasped devices. We designed generic interactions for discrete, continuous, and combined gesture commands that are executed without hand-eye coordination, because the performing fingers are hidden behind the grasped device. The interactions are designed so that the thumb can always serve as a proprioceptive reference for guiding finger movements, applying embodied knowledge about the body's structure. In a user study, we tested these touch-based interactions for performance and for users' perceived task load. We combined two iPads back-to-back to form a double-sided touchscreen device: the PinchPad. We discuss the main errors that reduced accuracy, identify stable features that lower the error rate, and discuss the role of the 'body schema' in designing gesture-based interactions where users cannot see their hands.
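The core idea of using the front-side thumb as a proprioceptive anchor for hidden back-of-device fingers can be illustrated with a minimal sketch. This is a hypothetical illustration, not the authors' implementation: the function name, the coordinate convention (back-surface touches mirrored into the front screen's coordinate space), and the threshold value are all assumptions made for the example.

```python
import math

def classify_back_touch(thumb_front, finger_back, tap_radius=30.0):
    """Classify a back-of-device touch relative to the front thumb anchor.

    Both touches are (x, y) points in the same coordinate space (the back
    surface mirrored onto the front screen). A back touch that lands close
    to the thumb is treated as a discrete "pinch" command; a larger offset
    is interpreted by its dominant direction, as in a continuous slide.
    """
    dx = finger_back[0] - thumb_front[0]
    dy = finger_back[1] - thumb_front[1]
    if math.hypot(dx, dy) <= tap_radius:
        return "pinch"  # finger aligned with the thumb reference
    # Otherwise the offset direction relative to the thumb selects a command.
    if abs(dx) >= abs(dy):
        return "slide-right" if dx > 0 else "slide-left"
    return "slide-down" if dy > 0 else "slide-up"

print(classify_back_touch((100, 100), (110, 95)))   # near the thumb -> pinch
print(classify_back_touch((100, 100), (200, 120)))  # far right -> slide-right
```

The point of anchoring everything to the thumb, rather than to absolute screen positions, is that the user can feel where the thumb is without seeing it, which is what makes such eyes-free back-of-device gestures plausible.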