Digital Ira: creating a real-time photoreal digital actor

  • Authors:
  • Oleg Alexander, Graham Fyffe, Jay Busch, Xueming Yu, Ryosuke Ichikari, Andrew Jones, Paul Debevec, Jorge Jimenez, Etienne Danvoye, Bernardo Antionazzi, Mike Eheler, Zybnek Kysela, Javier von der Pahlen

  • Affiliations:
  • USC Institute for Creative Technologies (Alexander, Fyffe, Busch, Yu, Ichikari, Jones, Debevec); Activision, Inc. (Jimenez, Danvoye, Antionazzi, Eheler, Kysela, von der Pahlen)

  • Venue:
  • ACM SIGGRAPH 2013 Posters
  • Year:
  • 2013


Abstract

In 2008, the "Digital Emily" project [Alexander et al. 2009] showed how a set of high-resolution facial expressions scanned in a light stage could be rigged into a real-time photoreal digital character and driven with video-based facial animation techniques. However, Digital Emily was rendered offline, involved just the front of the face, and was never seen in a tight closeup. In this collaboration between Activision and USC ICT, shown at SIGGRAPH 2013's Real-Time Live venue, we endeavoured to create a real-time, photoreal digital human character that could be seen from any viewpoint, in any lighting, and could perform realistically from video performance capture even in a tight closeup. In addition, we wanted this to run in a real-time, game-ready production pipeline, ultimately achieving 180 frames per second for a full-screen character on a two-year-old graphics card.
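For context, the quoted 180 fps figure implies a per-frame rendering budget of roughly

$$ t_{\text{frame}} = \frac{1000\,\text{ms}}{180\,\text{fps}} \approx 5.6\,\text{ms}, $$

i.e. about a third of the 16.7 ms budget of a 60 Hz display, leaving substantial headroom for the rest of a game's rendering workload alongside the character.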