A Multimodal Interaction Scheme between a Blind User and the Tyflos Assistive Prototype

  • Authors:
  • Nikolaos Bourbakis; Robert Keefer; Dimitrios Dakopoulos; Anna Esposito

  • Venue:
  • ICTAI '08: Proceedings of the 2008 20th IEEE International Conference on Tools with Artificial Intelligence - Volume 02
  • Year:
  • 2008

Abstract

This paper presents the multimodal (visual and audio) interaction scheme used by the Tyflos prototype. Tyflos is a wearable prototype that provides reading and navigation assistance to visually impaired users. In particular, it integrates a wireless portable computer, cameras, range and GPS sensors, microphones, a natural language processor, a text-to-speech device, an ear speaker, a speech synthesizer, a 2D vibration vest, and a digital audio recorder. Data collected by the Tyflos sensors are processed by appropriate modules, each specialized in one or more tasks. We also present a Stochastic Petri-net model of the multimodal interaction scheme covering both Tyflos capabilities, reading and navigation. Simple illustrative examples from the reading and navigation cases demonstrate the multimodal interaction.
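
To make the Petri-net formulation concrete, the sketch below simulates a toy stochastic Petri net for a reading-mode interaction loop (voice command, image capture, OCR, speech output). The place and transition names and the firing rates are illustrative assumptions, not the model published in the paper; firing delays are drawn from exponential distributions, the usual timing choice for stochastic Petri nets.

```python
import random

# Places and initial marking (tokens): the system starts idle.
# These places are hypothetical, chosen to mimic a reading-mode loop.
INITIAL_MARKING = {
    "idle": 1,          # system waiting for the user
    "cmd_heard": 0,     # voice command recognized
    "image_taken": 0,   # camera captured a page image
    "text_ready": 0,    # OCR produced text
}

# Transitions: name -> (input places, output places, firing rate).
# Rates are made-up values for illustration only.
TRANSITIONS = {
    "hear_command":  ({"idle": 1},        {"cmd_heard": 1},   2.0),
    "capture_image": ({"cmd_heard": 1},   {"image_taken": 1}, 1.0),
    "run_ocr":       ({"image_taken": 1}, {"text_ready": 1},  0.5),
    "speak_text":    ({"text_ready": 1},  {"idle": 1},        1.5),
}

def enabled(marking, name):
    """A transition is enabled when every input place holds enough tokens."""
    inputs, _, _ = TRANSITIONS[name]
    return all(marking[p] >= n for p, n in inputs.items())

def fire(marking, name):
    """Consume input tokens and produce output tokens."""
    inputs, outputs, _ = TRANSITIONS[name]
    for p, n in inputs.items():
        marking[p] -= n
    for p, n in outputs.items():
        marking[p] += n

def simulate(horizon=10.0, seed=1):
    """Race semantics: every enabled transition samples an exponential
    delay and the fastest one fires; returns the firing trace."""
    random.seed(seed)
    marking = dict(INITIAL_MARKING)
    t, trace = 0.0, []
    while t < horizon:
        races = [(random.expovariate(rate), name)
                 for name, (_, _, rate) in TRANSITIONS.items()
                 if enabled(marking, name)]
        if not races:
            break  # deadlock: nothing enabled
        delay, name = min(races)
        t += delay
        fire(marking, name)
        trace.append((round(t, 3), name))
    return trace

if __name__ == "__main__":
    for t, name in simulate():
        print(f"t={t:6.3f}  fire {name}")
```

In this toy net only one transition is enabled at any moment, so the race is trivial; in a fuller model covering both reading and navigation, several transitions could be enabled at once and would genuinely compete, which is where the stochastic timing becomes meaningful.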