Input finger detection for nonvisual touch screen text entry in Perkinput

  • Authors:
  • Shiri Azenkot; Jacob O. Wobbrock; Sanjana Prasain; Richard E. Ladner

  • Affiliations:
  • University of Washington, Seattle, WA (all authors)

  • Venue:
  • Proceedings of Graphics Interface 2012
  • Year:
  • 2012

Abstract

We present Input Finger Detection (IFD), a novel technique for nonvisual touch screen input, and its application, the Perkinput text entry method. With IFD, signals are input into a device via multi-point touches, where each finger represents one bit: either touching the screen or not. Maximum likelihood and tracking algorithms detect which fingers touch the screen based on user-set reference points. The Perkinput text entry method uses the 6-bit Braille encoding with audio feedback, enabling one- and two-handed input. A longitudinal evaluation with 8 blind participants who are proficient in Braille showed that one-handed Perkinput was significantly faster and more accurate than the iPhone's VoiceOver. Furthermore, in a case study to evaluate expert performance, one user reached an average session speed of 17.56 words per minute (WPM) with an average uncorrected error rate of just 0.14% using one hand for input. The same participant reached an average session speed of 38.0 WPM with two-handed input and an error rate of just 0.26%. Her fastest phrase was entered at 52.4 WPM with no errors.
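To make the idea concrete, below is a minimal Python sketch of the two steps the abstract describes: assigning touch points to fingers using calibrated reference points, and interpreting the resulting chord as Braille dots. This is not the authors' algorithm; the reference coordinates, finger names, dot mapping, and the nearest-reference assignment (a simplified stand-in for the paper's maximum-likelihood detection) are all hypothetical, and only a handful of Braille letters are included.

```python
import itertools

# Hypothetical calibration: reference (x, y) positions for three fingers of one
# hand, captured when the user sets their reference points on the screen.
REFERENCE_POINTS = {
    "index": (80.0, 400.0),
    "middle": (160.0, 390.0),
    "ring": (240.0, 400.0),
}

# Simplified 6-dot Braille table (dots numbered 1-6); only a few letters shown.
BRAILLE_TO_CHAR = {
    frozenset({1}): "a",
    frozenset({1, 2}): "b",
    frozenset({1, 4}): "c",
    frozenset({1, 4, 5}): "d",
    frozenset({1, 5}): "e",
}


def assign_fingers(touches, references=REFERENCE_POINTS):
    """Assign each touch to a distinct finger by minimizing total squared
    distance to the calibrated reference points (a simplified stand-in for
    the paper's maximum-likelihood assignment)."""
    fingers = list(references)
    best, best_cost = None, float("inf")
    for combo in itertools.permutations(fingers, len(touches)):
        cost = sum(
            (tx - references[f][0]) ** 2 + (ty - references[f][1]) ** 2
            for f, (tx, ty) in zip(combo, touches)
        )
        if cost < best_cost:
            best, best_cost = combo, cost
    return set(best) if best else set()


def chord_to_char(down_fingers, dot_map):
    """Translate the set of touching fingers into Braille dots, then a character."""
    dots = frozenset(dot_map[f] for f in down_fingers)
    return BRAILLE_TO_CHAR.get(dots, "?")


if __name__ == "__main__":
    # One-handed Perkinput enters a character's dots in two halves (dots 1-3,
    # then 4-6); this example covers only the first half with a hypothetical
    # finger-to-dot mapping.
    dot_map = {"index": 1, "middle": 2, "ring": 3}
    touches = [(83.0, 405.0), (158.0, 388.0)]   # two fingers down
    fingers = assign_fingers(touches)
    print(fingers)                              # {'index', 'middle'}
    print(chord_to_char(fingers, dot_map))      # 'b' (dots 1 and 2)
```

The key design point the sketch illustrates is that IFD does not require touches to land on fixed on-screen targets: fingers are identified relative to user-set reference points, which is what makes the technique usable without vision.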