User identification using raw sensor data from typing on interactive displays

  • Authors:
  • Philipp Mock; Jörg Edelmann; Andreas Schilling; Wolfgang Rosenstiel

  • Affiliations:
  • University of Tübingen, Tübingen, Germany; Knowledge Media Research Center, Tübingen, Germany; University of Tübingen, Tübingen, Germany; University of Tübingen, Tübingen, Germany

  • Venue:
  • Proceedings of the 19th International Conference on Intelligent User Interfaces
  • Year:
  • 2014

Abstract

Personalized soft keyboards that adapt to a user's individual typing behavior can reduce typing errors on interactive displays. In multi-user scenarios, a personalized model has to be loaded for each participant. In this paper we describe a user identification technique based on raw sensor data from an optical touch screen. For user classification we use a multi-class support vector machine trained on grayscale images from the optical sensor. Our implementation can identify a specific user from a set of 12 users with an average accuracy of 97.51% after one keystroke. It can be used to automatically select individual typing models during free-text entry. The resulting authentication process is completely implicit. We furthermore describe how the approach can be extended to the automatic loading of personal information and settings.
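
The sketch below is a rough illustration of the classification pipeline the abstract describes: a multi-class SVM trained on flattened grayscale sensor frames, one frame per keystroke. The frame size, kernel, SVM parameters, and placeholder data are assumptions made for illustration only, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): identify a user from a single
# keystroke by classifying the raw grayscale frame of an optical touch sensor
# with a multi-class SVM.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

NUM_USERS = 12           # matches the paper's evaluation set
FRAME_SHAPE = (32, 32)   # assumed sensor patch size around a keystroke

def keystroke_features(frame: np.ndarray) -> np.ndarray:
    """Flatten one grayscale sensor frame (one keystroke) into a feature vector."""
    return frame.astype(np.float32).ravel()

# Placeholder data: replace with real per-keystroke sensor frames and user labels.
rng = np.random.default_rng(0)
frames = rng.random((NUM_USERS * 100, *FRAME_SHAPE))   # 100 keystrokes per user
labels = np.repeat(np.arange(NUM_USERS), 100)          # user id per keystroke

X = np.stack([keystroke_features(f) for f in frames])
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, stratify=labels, random_state=0)

# Multi-class SVM on the flattened grayscale frames
# (scikit-learn's SVC handles the multi-class case internally).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X_train, y_train)
print(f"per-keystroke identification accuracy: {clf.score(X_test, y_test):.2%}")
```

With real sensor data, each predicted label would select the corresponding user's personalized typing model during free-text entry, which is what makes the identification step implicit.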