We present research investigating how much guidance users require for precise back-of-device interaction. Specifically, we explore how pointing effectiveness is influenced by the presence or absence of visual guidance feedback. Participants were asked to select targets displayed on an iPad by touching and releasing them from underneath the device; a second iPad was used to detect finger positions from the rear. Results showed that participants selected targets as accurately without visual feedback of finger position as with it. Additionally, no significant increase in workload was identified when visual feedback was removed. Our results suggest that users do not require complex techniques for visualizing finger position on the rear of the device: removing visual feedback did not significantly affect performance measures such as effectiveness, perceived performance, or the number of trials needed to select a target. We also outline the implications of our findings and our future work to more fully investigate the effect of visual guidance feedback.