More than meets the eye: An engineering study to empirically examine the blending of real and virtual color spaces

  • Authors:
  • Joseph L. Gabbard; J. Edward Swan II; Jason Zedlitz; Woodrow W. Winchester

  • Affiliations:
  • Center for Human-Comput. Interaction, Virginia Tech, Blacksburg, VA, USA; Ind. Syst. Eng., Virginia Tech, Blacksburg, VA, USA; Comput. Sci. & Eng., Mississippi State Univ., Starkville, MS, USA; Ind. Syst. Eng., Virginia Tech, Blacksburg, VA, USA

  • Venue:
  • VR '10: Proceedings of the 2010 IEEE Virtual Reality Conference
  • Year:
  • 2010


Abstract

It is well documented that natural lighting conditions and real-world backgrounds affect the usability of optical see-through augmented reality (AR) displays in outdoor environments. In many cases, outdoor environmental conditions can dramatically alter users' color perception of user interface elements, for example by washing out text or icon colors. As a result, users' semantic interpretation of interface elements can be compromised, rendering interface designs useless or counterproductive; this is an especially critical problem in application domains where color encoding is essential, such as military or medical visualization. In this paper, we present our experiences designing and constructing an optical AR testbed that emulates outdoor lighting conditions and allows us to measure the combined color of real-world backgrounds and virtual colors as projected through an optical see-through display. We present a formalization of color blending in AR, which supports further research on perceived color in AR displays. We describe an engineering study in which we measure the color of light that reaches an optical see-through display user's eye under systematically varied virtual and real-world conditions. Our results further quantify the effect of lighting and background color on the color of virtual graphics, and specifically quantify how virtual colors change across different real-world backgrounds.
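
To illustrate the kind of color blending the abstract refers to, the following is a minimal sketch assuming a simple additive model in which the light reaching the eye is the real-world background attenuated by the combiner optics plus the light emitted by the display, C_eye ≈ t · C_background + C_virtual. The function name, the transmissivity value, and the example colors are illustrative assumptions, not the paper's formalization or measured data.

```python
def blend_optical_see_through(background_rgb, virtual_rgb, transmissivity=0.7):
    """Approximate the color reaching the eye through an optical see-through display.

    background_rgb, virtual_rgb: linear RGB triples in [0, 1].
    transmissivity: fraction of real-world light passed by the optics
                    (0.7 is an illustrative value, not from the paper).
    """
    # Additive blending assumption: transmitted background light plus virtual light,
    # clipped to the displayable/perceivable range.
    return tuple(
        min(1.0, transmissivity * b + v)
        for b, v in zip(background_rgb, virtual_rgb)
    )

# Example: a mid-gray virtual icon viewed against a bright, sunlit background.
# The blended result is much brighter than the intended icon color, i.e. "washed out".
intended_icon = (0.5, 0.5, 0.5)
bright_background = (0.9, 0.9, 0.8)
print(blend_optical_see_through(bright_background, intended_icon))
```

Such a model makes explicit why the perceived color depends jointly on the virtual color, the background color, and the ambient lighting, which is exactly the dependence the engineering study measures.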