Personalized input: improving ten-finger touchscreen typing through automatic adaptation

  • Authors: Leah Findlater; Jacob Wobbrock
  • Affiliations: University of Maryland, College Park, MD & University of Washington, Seattle, WA, United States; University of Washington, Seattle, WA, United States
  • Venue: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
  • Year: 2012

Abstract

Although typing on touchscreens is slower than typing on physical keyboards, touchscreens offer a critical potential advantage: they are software-based, and, as such, the keyboard layout and classification models used to interpret key presses can dynamically adapt to suit each user's typing pattern. To explore this potential, we introduce and evaluate two novel personalized keyboard interfaces, both of which adapt their underlying key-press classification models. The first keyboard also visually adapts the location of keys while the second one always maintains a visually stable rectangular layout. A three-session user evaluation showed that the keyboard with the stable rectangular layout significantly improved typing speed compared to a control condition with no personalization. Although no similar benefit was found for the keyboard that also offered visual adaptation, overall subjective response to both new touchscreen keyboards was positive. As personalized keyboards are still an emerging area of research, we also outline a design space that includes dimensions of adaptation and key-press classification features.
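The abstract does not describe the classification models themselves. As a rough, hypothetical illustration of what "adapting the underlying key-press classification model" can mean, the sketch below models each key as a 2D Gaussian over a user's touch points and updates it online as touches are observed. The class name, parameters, and update scheme are illustrative assumptions, not the authors' method.

```python
import numpy as np

class AdaptiveKeyClassifier:
    """Per-key touch model that adapts to a user's typing pattern.

    Each key is a 2D Gaussian over observed touch locations. A touch is
    assigned to the key with the highest log-likelihood, and that key's
    mean and covariance are then updated incrementally (hypothetical
    sketch; not the classifier from the paper).
    """

    def __init__(self, key_centers, init_var=100.0):
        # key_centers: dict mapping key label -> (x, y) visual center
        self.means = {k: np.asarray(c, dtype=float) for k, c in key_centers.items()}
        self.covs = {k: np.eye(2) * init_var for k in key_centers}
        self.counts = {k: 1 for k in key_centers}  # prior pseudo-count per key

    def _log_likelihood(self, key, point):
        # Unnormalized Gaussian log-likelihood of a touch point for this key.
        diff = point - self.means[key]
        cov = self.covs[key]
        _, logdet = np.linalg.slogdet(cov)
        return -0.5 * (diff @ np.linalg.inv(cov) @ diff + logdet)

    def classify(self, point):
        # Pick the key whose touch model best explains this point.
        point = np.asarray(point, dtype=float)
        return max(self.means, key=lambda k: self._log_likelihood(k, point))

    def update(self, key, point):
        # Incrementally fold the new touch into the key's mean/covariance.
        point = np.asarray(point, dtype=float)
        n_old = self.counts[key]
        n_new = n_old + 1
        old_mean = self.means[key]
        new_mean = old_mean + (point - old_mean) / n_new
        m2 = self.covs[key] * n_old + np.outer(point - old_mean, point - new_mean)
        self.covs[key] = m2 / n_new
        self.means[key] = new_mean
        self.counts[key] = n_new


# Example usage (coordinates are arbitrary pixel positions):
keys = {"f": (300, 200), "g": (340, 200), "h": (380, 200)}
clf = AdaptiveKeyClassifier(keys)
touch = (345, 208)            # this user tends to strike low and to the right
key = clf.classify(touch)     # -> "g"
clf.update(key, touch)        # model drifts toward the user's touch pattern
```

In this framing, the visually adaptive keyboard would also move the drawn keys toward the learned means, while the visually stable keyboard would adapt only the underlying models and leave the rectangular layout unchanged.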