A tangible interface for the AMI content linking device: the automated meeting assistant

  • Authors: Jochen Ehnes
  • Affiliation: Centre for Speech Technology Research, School of Informatics, University of Edinburgh, Edinburgh, UK
  • Venue: HSI'09 Proceedings of the 2nd conference on Human System Interactions
  • Year: 2009


Abstract

In this paper we describe our approach to supporting ongoing meetings with an automated meeting assistant. The system, based on the AMIDA Content Linking Device, uses automatic speech recognition to provide the ongoing meeting with relevant documents used in previous meetings. Once the content linking device finds documents linked to a discussion of a similar subject in a previous meeting, it assumes they may be relevant to the current discussion as well. We believe that the way these documents are offered to the meeting participants is as important as the way they are found. We developed a projection-based mixed reality user interface that lets the documents appear on the table tops in front of the meeting participants. Participants can easily hand documents over to others or bring them onto the shared projection screen if they consider them relevant for others as well; at the same time, irrelevant documents do not draw undue attention away from the discussion. We describe the concept and implementation of this user interface and present some preliminary results.