This paper presents PUB (Point Upon Body), a novel interaction system that explores eyes-free interaction in personal space by letting users tap on their own forearms, receiving natural haptic feedback from their skin. Two user studies examine how precisely users can interact with their forearms and how they behave when operating within their arm space. The results show that, with iterative practice, users can divide the region between wrist and elbow into at most six distinct points, and that each user's division pattern is unique. Based on design principles drawn from these observations, the PUB system demonstrates how interaction design can benefit from these findings. Two scenarios, remote display control and mobile device control, are demonstrated using the UltraSonic device attached to the user's wrist to detect tapped positions.
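Since each user's division pattern is unique, a system like PUB must be calibrated per user before tapped positions can be mapped to the (up to six) points between wrist and elbow. The paper does not specify its classification method; the sketch below is a minimal illustration only, assuming tap readings arrive as normalized wrist-to-elbow distances and using a simple nearest-centroid rule over per-user calibration samples.

```python
from statistics import mean

def calibrate(samples_per_point):
    """Build a per-user model: given repeated tap readings (assumed
    normalized, 0.0 = wrist, 1.0 = elbow) for each target point,
    return one centroid per point, ordered from wrist to elbow."""
    return [mean(samples) for samples in samples_per_point]

def classify_tap(reading, centroids):
    """Map a new sensor reading to the index of the nearest
    calibrated point (0 = the point closest to the wrist)."""
    return min(range(len(centroids)),
               key=lambda i: abs(reading - centroids[i]))

# Hypothetical calibration data for a user who divides the forearm
# into six points (two sample taps per point).
centroids = calibrate([[0.08, 0.10], [0.24, 0.26], [0.41, 0.43],
                       [0.58, 0.60], [0.75, 0.77], [0.91, 0.93]])
classify_tap(0.44, centroids)  # → 2, the third point from the wrist
```

A nearest-centroid rule is the simplest choice consistent with the finding that users settle into a stable personal division pattern; a deployed system would likely also reject readings far from every centroid as unintended contact.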