In this work-in-progress paper, we make a case for leveraging the unique affordances of the human ear for eyes-free, mobile interaction. We present EarPut, a novel interface concept that instruments the ear as an interactive surface for touch-based interaction, together with its prototypical hardware implementation. The central idea behind EarPut is to go beyond prior work by unobtrusively augmenting a variety of accessories worn behind the ear, such as headsets or glasses. Results from a controlled experiment with 27 participants provide empirical evidence that people can target salient regions on their own ear effectively and precisely. Moreover, we contribute a first, systematically derived interaction design space for ear-based interaction and a set of exemplary applications.