Future trends in computer graphics: how much is enough?

  • Authors: A. R. Forrest
  • Affiliations: School of Computing Sciences, University of East Anglia, Norwich, U.K.
  • Venue: Journal of Computer Science and Technology
  • Year: 2003


Abstract

Over the forty-year history of interactive computer graphics there have been continuous advances, but at some stage this progression must end, with images sufficiently realistic for all practical purposes. How much detail do we really need? Polygon counts over a few million imply that, on average, each polygon paints less than a single pixel, making the use of polygon-shading hardware wasteful. We consider the problem of determining how much realism is required for a variety of applications, and discuss how current trends in computer graphics hardware, in particular graphics cards targeted at the computer games industry, will help or hinder the achievement of these requirements. With images now so convincingly realistic in many cases, viewers often suspend their critical faculties and accept the images as correct and truthful, although they may well be incorrect and sometimes misleading or untruthful. Display resolution has remained largely constant in spatial terms for the last twenty years, and the number of pixels has increased by less than an order of magnitude. If the long-promised breakthroughs in display technology are finally realised, how should we use the increased resolution?
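The abstract's polygon-per-pixel claim follows from simple arithmetic. As a minimal sketch, assuming a hypothetical 1280×1024 display (roughly 1.3 million pixels, typical of the paper's era), the snippet below computes the average screen area available to each polygon if every polygon were visible; once the scene exceeds a few million polygons, the average falls below one pixel per polygon.

```python
# Back-of-the-envelope check of the polygon-per-pixel argument.
# Display dimensions and polygon counts are illustrative assumptions,
# not figures taken from the paper.

def pixels_per_polygon(width: int, height: int, polygon_count: int) -> float:
    """Average pixels available per polygon, assuming every polygon
    in the scene is visible and polygons do not overlap."""
    return (width * height) / polygon_count

if __name__ == "__main__":
    for polygons in (1_000_000, 3_000_000, 10_000_000):
        ratio = pixels_per_polygon(1280, 1024, polygons)
        print(f"{polygons:>10,} polygons -> {ratio:.2f} pixels/polygon")
```

On this assumed display, one million polygons already leaves only about 1.3 pixels each, and three million leaves about 0.4, which is the sense in which per-polygon shading hardware does wasted work at such scene complexities.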