Multimodal Interaction During Multiparty Dialogues: Initial Results

  • Authors:
  • Philip R. Cohen; Rachel Coulston; Kelly Krout

  • Affiliations:
  • Oregon Health & Science University; Oregon Health & Science University; Oregon Health & Science University

  • Venue:
  • ICMI '02 Proceedings of the 4th IEEE International Conference on Multimodal Interfaces
  • Year:
  • 2002

Abstract

Groups of people collaborating on a task often incorporate the objects in their mutual environment into their discussion. With this comes physical reference to these 3-D objects, including gesture, gaze, haptics, and possibly other modalities, over and above the speech we commonly associate with human-human communication. From a technological perspective, this human style of communication challenges researchers not only to create multimodal systems capable of integrating input from various modalities, but to do so well enough that the system supports, rather than interferes with, the collaborators' primary goal: their own human-human interaction. This paper offers a first step towards building such multimodal systems for supporting face-to-face collaborative work by providing both qualitative and quantitative analyses of multiparty multimodal dialogues in a field setting.