We propose a method for learning how to use an imaginary interface (i.e., a spatial, non-visual interface) that we call "transfer learning": by using a physical device (e.g., an iPhone), a user inadvertently learns the interface and can then transfer that knowledge to an imaginary interface. We illustrate this concept with our Imaginary Phone prototype. With it, users interact by mimicking the use of a physical iPhone, tapping and sliding on their empty non-dominant hand without visual feedback. Pointing on the hand is tracked using a depth camera, and touch events are sent wirelessly to an actual iPhone, where they invoke the corresponding actions. Our prototype allows users to perform everyday tasks such as answering a phone call or launching the timer app and setting an alarm. The Imaginary Phone thereby serves as a shortcut that frees users from the need to retrieve the actual physical device. We present two user studies that validate the three assumptions underlying the transfer learning method. (1) Users build up spatial memory automatically while using a physical device: participants knew the correct location of 68% of their own iPhone home screen apps by heart. (2) Spatial memory transfers from a physical to an imaginary interface: participants recalled 61% of their home screen apps when recalling app locations on the palm of their hand. (3) Palm interaction is precise enough to operate a typical mobile phone: participants could reliably acquire 0.95 cm wide iPhone targets on their palm, which is sufficiently large to operate any standard iPhone widget.
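To make the tracking-to-phone pipeline concrete, the following is a minimal illustrative sketch (not the authors' implementation): it maps a depth-camera touch point on the palm to iPhone screen coordinates, resolves the home-screen icon under that point, and forwards the tap wirelessly. The grid layout, screen dimensions, message format, and network address are assumptions chosen for illustration.

```python
# Hypothetical sketch of the palm-to-phone mapping described in the abstract.
# Palm coordinates, the 320x480 screen, the 4x5 icon grid, and the UDP endpoint
# are assumed values, not details taken from the Imaginary Phone prototype.
import json
import socket

IPHONE_SCREEN = (320, 480)   # screen size in points (assumed iPhone 4-era display)
HOME_GRID = (4, 5)           # columns x rows of home-screen icons (assumed layout)


def palm_to_screen(u, v, screen=IPHONE_SCREEN):
    """Convert a normalized palm coordinate (u, v) in [0, 1]^2 to screen points."""
    return (u * screen[0], v * screen[1])


def screen_to_app_slot(x, y, screen=IPHONE_SCREEN, grid=HOME_GRID):
    """Return the (column, row) of the home-screen icon under screen point (x, y)."""
    col = min(int(x / screen[0] * grid[0]), grid[0] - 1)
    row = min(int(y / screen[1] * grid[1]), grid[1] - 1)
    return col, row


def send_tap(u, v, host="192.168.0.42", port=9000):
    """Forward a palm tap to a listener on the phone (address is illustrative)."""
    x, y = palm_to_screen(u, v)
    msg = json.dumps({"type": "tap", "x": x, "y": y}).encode()
    socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, (host, port))


if __name__ == "__main__":
    # Example: a touch two thirds across and one third down the palm
    # lands in the third icon column, second icon row.
    print(screen_to_app_slot(*palm_to_screen(0.66, 0.33)))
```

In the same spirit, the reported 0.95 cm target width corresponds to roughly one icon-sized cell on the palm, which is why a coarse grid lookup like the one above would suffice for home-screen selection under the study's accuracy figures.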