Imaginary Interfaces are screen-less, ultra-mobile interfaces. Previously we showed that even though they offer no visual feedback, they allow users to interact spatially, e.g., by pointing at a location on their non-dominant hand. The primary goal of this paper is to provide a deeper understanding of palm-based imaginary interfaces, i.e., why they work. We perform our exploration using an interaction style inspired by interfaces for visually impaired users: we implemented a system that audibly announces target names as users scrub across their palm. Based on this interface, we conducted three studies. We found that (1) even though imaginary interfaces cannot display visual content, the visual sense remains the main mechanism by which users control the interface, as they watch their hands interact. (2) When we remove the visual sense by blindfolding, the tactile cues of both hands feeling each other partly replace the missing visual cues, keeping imaginary interfaces usable. (3) While we initially expected the cues sensed by the pointing finger to be most important, we found instead that it is the tactile cues sensed by the palm that allow users to orient themselves most effectively. While these findings are primarily intended to deepen our understanding of Imaginary Interfaces, they also show that eyes-free interfaces located on skin outperform interfaces on physical devices. In particular, this suggests that palm-based imaginary interfaces may have benefits for visually impaired users, potentially outperforming the touchscreen-based devices they use today.
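The scrub-to-announce interaction described above can be made concrete with a minimal sketch. This is not the authors' implementation; it assumes a hypothetical tracker that streams normalized (x, y) touch positions on the palm and a text-to-speech callable `speak`, and all target names and grid dimensions here are invented for illustration.

```python
# Minimal sketch (assumed, not from the paper): map normalized palm
# positions to a grid of named targets and announce a target's name
# whenever the scrubbing finger enters its cell.

GRID = [
    ["Phone",  "Mail",  "Music"],
    ["Maps",   "Clock", "Notes"],
    ["Photos", "Radio", "Web"],
]

def target_at(x, y, grid=GRID):
    """Map a normalized palm position (0..1, 0..1) to a target name."""
    rows, cols = len(grid), len(grid[0])
    row = min(int(y * rows), rows - 1)  # clamp so y == 1.0 stays in range
    col = min(int(x * cols), cols - 1)
    return grid[row][col]

def scrub_loop(positions, speak):
    """Announce each target as the finger enters its cell while scrubbing."""
    current = None
    for x, y in positions:
        name = target_at(x, y)
        if name != current:  # re-announce only on cell change
            current = name
            speak(name)

if __name__ == "__main__":
    # Simulated scrub path; a real system would read tracker events.
    scrub_loop([(0.1, 0.1), (0.5, 0.1), (0.5, 0.9)], print)
    # prints: Phone, Mail, Radio
```

Announcing only on cell changes, rather than on every sample, keeps the audio channel from flooding while the finger rests on a single target; the actual feedback policy used in the study may differ.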