A taxonomy of (real and virtual world) display and control interactions

  • Authors:
  • Paul Milgram

  • Affiliations:
  • University of Toronto

  • Venue:
  • Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology
  • Year:
  • 2009


Abstract

Interactions between a human operator and objects in his/her world afford a wide range of possible viewing and control options, which can vary with respect to time, space, proximity, and frame of reference. Whereas it is conventionally recognised that essentially any kind of interaction metaphor can in principle be simulated in a virtual environment, modern image processing technology now permits greatly increased flexibility also for real world interactions with indirect viewing: the common video camera has gone beyond being a simple eye on the (remote) real world to being an instrument that is able to integrate and interpolate real world images spatially and temporally, in real time. In the talk, I shall propose a framework for classifying viewing and manipulation interactions for any task where visual feedback is provided. The framework involves identifying key components of the environment that are central to the interaction, in terms of the multidimensional couplings among the components, where the configuration and characteristics of the couplings determine the nature of the visual and manual control interactions experienced by the operator. The framework is domain independent, and is intended to be used both to classify current display-control interactions and to identify future areas of research.