Multimodal user interfaces for a travel assistant

  • Authors: Alain Goyé (ENST); Eric Lecolinet (ENST); Shiuan-Sung Lin (ENST); Gérard Chollet (ENST); Catherine Pelachaud (Université de Paris VIII); Xiaoqing Ding (Tsinghua University); Yang Ni (Institut National des Télécommunications)

  • Venue: IHM 2003, Proceedings of the 15ème Conférence Francophone sur l'Interaction Homme-Machine (15th French-speaking conference on human-computer interaction)

  • Year: 2003

Abstract

As part of a project to develop a personal assistant for travellers, we have studied three types of multimodal interfaces for a PDA: 1) a combination of Control menus and vocal inputs to control zoomable user interfaces to graphical or textual databases, 2) refinement of pictures captured by the integrated camera, based on correlating a series of pictures, in order to enhance character recognition, and 3) embodied conversational agents able to communicate via synchronized speech and culture-dependent nonverbal behaviors (face, gaze and gesture). In this paper, we describe these three modalities and their integration in the main application.
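The second modality, refining a series of camera captures before character recognition, can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes a simple model in which successive frames differ only by translation and sensor noise, estimates each frame's offset by FFT-based cross-correlation, and averages the aligned frames to reduce noise. All names (`estimate_shift`, `fuse_frames`) are hypothetical.

```python
# Illustrative sketch (assumed model, not the paper's method):
# align several noisy captures of the same scene by cross-correlation,
# then average them to reduce noise ahead of character recognition.
import numpy as np

def estimate_shift(ref, frame):
    """Estimate the integer (dy, dx) roll that aligns `frame` onto
    `ref`, using FFT-based circular cross-correlation."""
    ref = ref - ref.mean()          # remove DC so the peak is sharp
    frame = frame - frame.mean()
    corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape
    if dy > h // 2:                 # unwrap circular offsets to signed values
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def fuse_frames(frames):
    """Align every frame to the first one and average the stack."""
    ref = frames[0]
    acc = ref.astype(float).copy()
    for frame in frames[1:]:
        dy, dx = estimate_shift(ref, frame)
        acc += np.roll(frame, (dy, dx), axis=(0, 1))
    return acc / len(frames)

# Demo on synthetic data: a "character stroke", shifted and noised.
rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[20:44, 30:34] = 1.0
frames = [np.roll(clean, s, axis=(0, 1)) + rng.normal(0, 0.3, clean.shape)
          for s in [(0, 0), (2, -3), (-1, 4)]]
fused = fuse_frames(frames)
# Averaging n aligned frames reduces the noise std by roughly sqrt(n).
```

Averaging is the simplest form of multi-frame fusion; with sub-pixel registration the same idea extends to super-resolution, which is what makes small printed characters recoverable from a low-resolution PDA camera.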