A mixed reality approach for interactively blending dynamic models with corresponding physical phenomena

  • Authors:
  • John Quarles; Paul Fishwick; Samsun Lampotang; Ira Fischler; Benjamin Lok

  • Affiliations:
  • University of Texas at San Antonio, San Antonio, TX (Quarles); University of Florida, Gainesville, FL (Fishwick, Lampotang, Fischler, Lok)

  • Venue:
  • ACM Transactions on Modeling and Computer Simulation (TOMACS)
  • Year:
  • 2010

Abstract

The design, visualization, manipulation, and implementation of models are key activities in the discipline of computer simulation. Models are constructed as a means to understand physical phenomena whose state changes over time. One issue that arises is the need to correlate models and their components with the phenomena being modeled. For example, a part of an automotive engine needs to be placed into cognitive context with the diagrammatic icon that represents that part's function. A typical solution to this problem is to display a dynamic model of the engine in one window and the engine's CAD model in another, leaving users to mentally blend the dynamic model and the physical phenomenon into the same context on their own. However, this contextualization is far from trivial in many applications. Our approach expands upon this form of user interaction by specifying two ways in which dynamic models and the corresponding physical phenomena may be viewed, and experimented with, within the same human interaction space. We present a methodology and an implementation of contextualization for diagram-based dynamic models using an anesthesia machine, and report a human study of its effects on spatial cognition.
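
To make the blending idea concrete, the following is a minimal illustrative sketch, not drawn from the paper itself: it assumes a camera image of the physical phenomenon and a model rendering already registered to the same view, and composites them with a user-controlled blend factor so both can be inspected in one interaction space. All names and parameters here are hypothetical.

```python
# Minimal sketch (not the authors' implementation): linearly blend a
# rendered dynamic-model overlay into an image of the physical scene.

import numpy as np

def blend_model_with_scene(scene: np.ndarray,
                           model_overlay: np.ndarray,
                           alpha: float) -> np.ndarray:
    """Composite a dynamic-model rendering onto the physical scene.

    scene         -- HxWx3 uint8 camera image of the physical phenomenon
    model_overlay -- HxWx3 uint8 rendering of the diagrammatic model,
                     assumed registered to the same viewpoint
    alpha         -- 0.0 shows only the physical scene,
                     1.0 shows only the dynamic model
    """
    alpha = float(np.clip(alpha, 0.0, 1.0))
    blended = (1.0 - alpha) * scene.astype(np.float32) \
              + alpha * model_overlay.astype(np.float32)
    return blended.astype(np.uint8)

if __name__ == "__main__":
    # Stand-in images; a real system would use a tracked camera feed
    # and a renderer that keeps the model aligned with the machine.
    scene = np.full((480, 640, 3), 128, dtype=np.uint8)
    overlay = np.zeros((480, 640, 3), dtype=np.uint8)
    overlay[200:280, 250:390] = (0, 255, 0)  # one model-component icon

    for alpha in (0.0, 0.5, 1.0):
        frame = blend_model_with_scene(scene, overlay, alpha)
        print(f"alpha={alpha}: mean intensity {frame.mean():.1f}")
```

Binding the blend factor to an interactive control (a slider or tracked prop) would let a user slide between the diagrammatic model and the physical view, which is the kind of in-context inspection the abstract describes.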