Distributed pointing for multimodal collaboration over sketched diagrams

  • Authors:
  • Paulo Barthelmess; Ed Kaiser; Xiao Huang; David Demirdjian

  • Affiliations:
  • Oregon Health & Science University, Beaverton, OR; Oregon Health & Science University, Beaverton, OR; Oregon Health & Science University, Beaverton, OR; Massachusetts Institute of Technology

  • Venue:
  • ICMI '05: Proceedings of the 7th International Conference on Multimodal Interfaces
  • Year:
  • 2005


Abstract

A problem faced by groups that are not co-located but need to collaborate on a common task is reduced access to the rich multimodal communicative context they would have if they were collaborating face-to-face. Collaboration support tools aim to reduce the adverse effects of this restricted access to the fluid intermixing of speech, gesturing, writing, and sketching by providing mechanisms that enhance distributed participants' awareness of each other's actions.

In this work we explore novel ways to leverage the capabilities of multimodal context-aware systems to bridge co-located and distributed collaboration contexts. We describe a system that allows participants at remote sites to collaborate in building a project schedule by sketching on multiple distributed whiteboards, and we show how participants can be made aware of naturally occurring pointing gestures that reference diagram constituents as they are performed by remote participants.

The system explores the multimodal fusion of pen, speech, and 3D gestures, coupled to the dynamic construction of a semantic representation of the interaction, anchored on the sketched diagram, to provide feedback that overcomes some of the intrinsic ambiguities of pointing gestures.
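
The abstract does not detail the fusion algorithm itself. As a rough, illustrative sketch of how pointing ambiguity can be reduced by combining modalities, the Python snippet below ranks diagram constituents by a weighted mix of spatial proximity to the estimated pointing location and lexical overlap with concurrent speech. The Constituent class, the rank_referents function, the weights, and the example data are hypothetical placeholders and are not taken from the paper.

    from dataclasses import dataclass
    from math import exp, hypot

    @dataclass
    class Constituent:
        """A sketched diagram element (e.g., a task box in a project schedule)."""
        name: str
        x: float
        y: float
        labels: tuple  # spoken words that plausibly refer to this element

    def rank_referents(constituents, point_x, point_y, spoken_words,
                       spatial_sigma=50.0, speech_weight=0.5):
        """Rank constituents as referents of an ambiguous pointing gesture.

        Combines a Gaussian spatial score around the estimated pointing
        location with a simple lexical-overlap score from concurrent speech.
        Parameter names and weights are illustrative assumptions only.
        """
        scored = []
        for c in constituents:
            dist = hypot(c.x - point_x, c.y - point_y)
            spatial = exp(-(dist ** 2) / (2 * spatial_sigma ** 2))
            overlap = len(set(spoken_words) & set(c.labels)) / max(len(c.labels), 1)
            score = (1 - speech_weight) * spatial + speech_weight * overlap
            scored.append((score, c.name))
        return sorted(scored, reverse=True)

    if __name__ == "__main__":
        diagram = [
            Constituent("design-review", 120, 80, ("design", "review")),
            Constituent("testing", 140, 95, ("testing", "qa")),
            Constituent("deployment", 400, 300, ("deployment", "release")),
        ]
        # A pointing gesture lands between two nearby boxes; speech disambiguates.
        for score, name in rank_referents(diagram, 130, 90, ["the", "testing", "task"]):
            print(f"{name}: {score:.2f}")

In this toy example the gesture alone is ambiguous between the two adjacent boxes, but the spoken word "testing" shifts the ranking, which is the general idea of resolving pointing against diagram constituents with complementary modalities.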