User input on television (TV) typically requires a mediator device, such as a handheld remote control. Although this is a well-established interaction paradigm, a handheld device has serious drawbacks: being mobile, it is easily misplaced, and if it has a touch-screen interface it also demands additional visual attention. Emerging interaction paradigms such as 3D mid-air gestures, enabled by novel depth sensors like Microsoft's Kinect, aim to overcome these limitations but are known to be tiring, among other issues. In this paper, we propose leveraging the palm as an interactive surface for TV remote control. Our contribution is threefold: (1) we explore the conceptual design space in an exploratory study; (2) building on these results, we investigate the effectiveness and accuracy of such an interface in a controlled experiment; and (3) we contribute PalmRC, an eyes-free, palm-surface-based TV remote control, which is evaluated in an early user-feedback session. Our results show that the palm has the potential to support device-less and eyes-free TV remote interaction without any third-party mediator device.
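To make the palm-as-surface idea concrete, the following is a minimal sketch of how a tracked touch position on the palm could be mapped to remote-control commands. All names, the grid layout, and the coordinate convention here are illustrative assumptions for exposition, not the paper's actual implementation:

```python
# Hypothetical sketch: map a touch at normalized palm coordinates
# (x, y), each in [0, 1] as might be reported by a depth-camera
# tracker, onto a 3x4 grid of "buttons" laid out on the palm.
# The grid contents are an assumed example layout.
REMOTE_GRID = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["vol-", "ok", "vol+"],
]

def palm_touch_to_command(x: float, y: float) -> str:
    """Return the command of the grid cell containing the touch."""
    rows = len(REMOTE_GRID)
    cols = len(REMOTE_GRID[0])
    # Clamp so x == 1.0 or y == 1.0 still falls in the last cell.
    col = min(int(x * cols), cols - 1)
    row = min(int(y * rows), rows - 1)
    return REMOTE_GRID[row][col]
```

For example, a touch near the upper-left corner of the palm, `palm_touch_to_command(0.1, 0.1)`, would select `"1"`, while one near the lower-right corner would select `"vol+"`. An eyes-free layout like this relies on the user's spatial memory of their own palm rather than on visual feedback.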