An Interactive Computer Vision System - DyPERS: Dynamic Personal Enhanced Reality System

  • Authors:
  • Bernt Schiele; Nuria Oliver; Tony Jebara; Alex Pentland

  • Venue:
  • ICVS '99 Proceedings of the First International Conference on Computer Vision Systems
  • Year:
  • 1999

Abstract

DyPERS, the 'Dynamic Personal Enhanced Reality System', uses augmented reality and computer vision to autonomously retrieve 'media memories' based on associations with real objects the user encounters. These memories are evoked as audio and video clips relevant to the user and overlaid on top of the objects that triggered them. The system runs an adaptive, audio-visual learning system on a tetherless wearable computer. The user's visual and auditory scene is recorded in real time by the system (upon request) and is then associated (by user input) with a snapshot of a visual object. The object acts as a key: when the real-time vision system detects its presence in the scene again, DyPERS plays back the appropriate audio-visual sequence. The vision system is a probabilistic recognition algorithm capable of discriminating between hundreds of everyday objects under varying viewing conditions (view changes, lighting, etc.). Once an audio-visual clip is stored, the vision system automatically recalls and plays it back whenever it detects the object the user chose as a reminder of that sequence. The DyPERS interface augments the user without encumbering them and effectively mimics a form of audio-visual memory. First results on performance and usability are reported.
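The abstract describes an associate-then-recall loop: record an audio-visual clip, bind it to a snapshot of an object, and play the clip back when the recognizer sees that object again. The following is a minimal Python sketch of that flow, not the authors' implementation; it assumes a simple histogram-intersection matcher as a stand-in for the paper's probabilistic recognizer, and file paths as stand-ins for recorded clips. The class name, threshold, and bin count are illustrative choices.

```python
import numpy as np

class DyPERSSketch:
    """Illustrative associate-and-recall loop in the spirit of DyPERS.

    Each stored object is modeled as a normalized intensity histogram
    (a placeholder for the paper's probabilistic object recognizer);
    each object key maps to a recorded audio-visual clip.
    """

    def __init__(self, n_bins=64, threshold=0.7):
        self.n_bins = n_bins
        self.threshold = threshold   # minimum match score to trigger playback
        self.models = {}             # object_id -> feature histogram
        self.clips = {}              # object_id -> associated A/V clip path

    def _histogram(self, image):
        # Flatten pixel intensities into a normalized histogram.
        hist, _ = np.histogram(image, bins=self.n_bins, range=(0, 255))
        return hist / max(hist.sum(), 1)

    def associate(self, object_id, snapshot, clip_path):
        """'Record' step: bind a snapshot of an object to an A/V clip."""
        self.models[object_id] = self._histogram(snapshot)
        self.clips[object_id] = clip_path

    def recall(self, frame):
        """'Recognize' step: return the clip for the best-matching stored
        object, or None if nothing matches well enough."""
        query = self._histogram(frame)
        best_id, best_score = None, 0.0
        for object_id, model in self.models.items():
            score = np.minimum(model, query).sum()  # histogram intersection
            if score > best_score:
                best_id, best_score = object_id, score
        return self.clips[best_id] if best_score >= self.threshold else None


# Usage sketch: associate a clip with an object snapshot, then recall it
# when a similar frame is seen again (here the same synthetic image).
snapshot = np.random.randint(0, 256, (120, 160))
system = DyPERSSketch()
system.associate("poster", snapshot, "poster_explanation.mp4")
print(system.recall(snapshot))  # -> "poster_explanation.mp4"
```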