Typing on a touchscreen keyboard is very difficult without being able to see the keyboard. We propose a new approach in which users imagine a Qwerty keyboard somewhere on the device and tap out an entire sentence with no visual reference to the keyboard and no intermediate feedback about the letters or words typed. To demonstrate the feasibility of our approach, we developed an algorithm that decodes blind touchscreen typing with a character error rate of 18.5%. Our decoder combines three components: a model of the keyboard topology and tap variability, a point transformation algorithm, and a long-span statistical language model. Our initial results show that the proposed method yields fast entry rates and promising error rates. On one-third of the sentences, novices' highly noisy input was decoded with no errors at all.
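To illustrate the general idea of combining a tap-variability model with a language model, here is a minimal sketch. It is not the authors' decoder: the key centers, the 0.3-key row stagger, the Gaussian noise parameter `SIGMA`, and the toy bigram table are all assumptions chosen for illustration; the paper's point transformation step and long-span language model are omitted.

```python
# Minimal sketch of decoding noisy taps on an imagined Qwerty keyboard.
# All layout and noise parameters below are illustrative assumptions.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEYS = {ch: (c + 0.3 * r, float(r))          # assumed 0.3-key row stagger
        for r, row in enumerate(ROWS)
        for c, ch in enumerate(row)}

SIGMA = 0.6  # assumed tap-noise standard deviation, in key widths

def tap_log_lik(tap, ch):
    """Log-likelihood of an (x, y) tap under a Gaussian centred on key ch."""
    kx, ky = KEYS[ch]
    return -((tap[0] - kx) ** 2 + (tap[1] - ky) ** 2) / (2 * SIGMA ** 2)

# Toy character "language model": rewards a few common English bigrams,
# otherwise neutral. A real decoder would use a long-span statistical model.
BIGRAM_BONUS = {"th": 2.0, "he": 2.0, "er": 1.0, "in": 1.0, "re": 1.0}

def lm_log_prob(prefix, ch):
    if prefix and prefix[-1] + ch in BIGRAM_BONUS:
        return BIGRAM_BONUS[prefix[-1] + ch]
    return 0.0

def decode(taps, beam=10):
    """Beam search combining tap likelihoods with the language model."""
    beams = [("", 0.0)]
    for tap in taps:
        cands = [(prefix + ch,
                  score + tap_log_lik(tap, ch) + lm_log_prob(prefix, ch))
                 for prefix, score in beams for ch in KEYS]
        cands.sort(key=lambda t: t[1], reverse=True)
        beams = cands[:beam]
    return beams[0][0]

# Three noisy taps aimed roughly at the keys t, h, e
print(decode([(4.1, 0.2), (5.2, 0.9), (2.2, -0.1)]))  # → the
```

The beam search keeps only the highest-scoring partial decodings at each tap, which is how such decoders stay tractable as the tap sequence grows; the language-model term is what lets the decoder recover letters whose taps land closer to a neighbouring key.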