We present BackTap, an interaction technique that extends a smartphone's input vocabulary with four distinct tap locations on its back case. BackTap can be used eyes-free with the phone in a user's pocket, purse, or armband while walking, or while holding the phone with two hands so that the fingers do not occlude the screen. We employ three sensors built into common smartphones (microphone, gyroscope, and accelerometer) and a lightweight heuristic implementation. In an evaluation with eleven participants across three usage conditions, users tapped the four distinct points with 92% to 96% accuracy.
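To make the sensing idea concrete, the sketch below shows one way a lightweight heuristic of this kind could be structured: a tap candidate is flagged when an accelerometer spike coincides with a burst of microphone energy, and the tap location is then classified from the sign of the rotation the tap induces about the gyroscope axes. All thresholds, the axis convention, and the quadrant rule here are illustrative assumptions, not the authors' published implementation.

```python
# Hedged sketch of a heuristic back-tap detector and four-location
# classifier. Thresholds and axis conventions are hypothetical.

def detect_tap(accel_mag: float, mic_energy: float,
               accel_thresh: float = 2.5, mic_thresh: float = 0.4) -> bool:
    """Flag a tap candidate: a sharp accelerometer magnitude spike
    that coincides with a burst of microphone energy (assumed
    thresholds in arbitrary units)."""
    return accel_mag > accel_thresh and mic_energy > mic_thresh

def classify_location(gyro_x: float, gyro_y: float) -> str:
    """Map the sign of the induced rotation about two gyroscope axes
    to one of four back-case tap locations (hypothetical convention:
    positive x-rotation = tap near the top, positive y-rotation =
    tap near the left edge)."""
    vertical = "top" if gyro_x > 0 else "bottom"
    horizontal = "left" if gyro_y > 0 else "right"
    return f"{vertical}-{horizontal}"

# Example: a strong spike with rotation suggesting a top-right tap.
if detect_tap(accel_mag=3.1, mic_energy=0.6):
    print(classify_location(gyro_x=0.8, gyro_y=-0.5))
```

Combining a motion cue with an acoustic cue is what lets a scheme like this reject false positives from walking or handling, since ordinary movement rarely produces both signatures at once.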