Privacy through pseudonymity in user-adaptive systems

  • Authors:
  • Alfred Kobsa; Jörg Schreck

  • Affiliations:
  • University of California, Irvine, CA; University of Essen

  • Venue:
  • ACM Transactions on Internet Technology (TOIT)
  • Year:
  • 2003

Abstract

User-adaptive applications cater to the needs of each individual computer user, taking into account, for example, users' interests, level of expertise, preferences, perceptual and motor abilities, and the usage environment. Central user modeling servers collect and process the information about users that different user-adaptive systems require to personalize their interaction with them.

Adaptive systems are generally better able to cater to users the more data their user modeling systems collect and process about them. They therefore gather as much data as possible and "lay them in stock" for possible future use. Moreover, data collection usually takes place without users' initiative, and sometimes even without their awareness, so as not to cause distraction. Both practices conflict with users' privacy concerns, which have become manifest in numerous recent consumer polls, and with data protection laws and guidelines that call for parsimony, purpose orientation, and user notification or user consent when personal data are collected and processed.

This article discusses security requirements for guaranteeing privacy in user-adaptive systems and explores ways to keep users anonymous while fully preserving personalized interaction with them. User anonymization in personalized systems goes beyond current models in that not only must users remain anonymous, but so must the user modeling system that maintains their personal data. Moreover, users' trust in anonymity can be expected to lead to more extensive and frank interaction, hence to more and better data about the user, and thus to better personalization. A reference model for pseudonymous and secure user modeling is presented that meets many of the proposed requirements.
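To make the pseudonymity idea concrete, the sketch below shows one minimal way a user model could be keyed by a client-generated pseudonym rather than a real identity. This is an illustrative assumption, not the reference model presented in the article: the class names (PseudonymousProfile, UserModelServer) and the attribute keys are hypothetical.

```python
import secrets


class PseudonymousProfile:
    """Hypothetical client-side identity wrapper: the user modeling
    server only ever sees a random pseudonym, never the real identity."""

    def __init__(self) -> None:
        # Random pseudonym generated on the user's side; the mapping
        # from pseudonym to real identity never leaves the client.
        self.pseudonym = secrets.token_hex(16)


class UserModelServer:
    """Hypothetical central user modeling server keyed solely by
    pseudonyms, so stored models cannot be linked to real identities."""

    def __init__(self) -> None:
        self._models: dict[str, dict[str, str]] = {}

    def store(self, pseudonym: str, attribute: str, value: str) -> None:
        # Personal data ("interests", "level of expertise", ...) is
        # filed under the pseudonym only.
        self._models.setdefault(pseudonym, {})[attribute] = value

    def retrieve(self, pseudonym: str) -> dict[str, str]:
        return self._models.get(pseudonym, {})


# Usage: personalization proceeds as usual, but the server cannot tie
# the accumulated model to a real-world user.
profile = PseudonymousProfile()
server = UserModelServer()
server.store(profile.pseudonym, "expertise", "novice")
server.store(profile.pseudonym, "interests", "privacy")
print(server.retrieve(profile.pseudonym))
```

Even this simple scheme preserves full personalization, since the server can accumulate and query the model under the pseudonym. The requirements discussed in the article go further: among other things, the user modeling system itself must also remain anonymous, and a real deployment would need an anonymous communication channel so that network addresses do not re-identify users; the sketch omits both.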