Eye contact in video conference via fusion of time-of-flight depth sensor and stereo

  • Authors:
  • Jiejie Zhu; Ruigang Yang; Xueqing Xiang

  • Affiliations:
  • UtopiaCompression Corp., Los Angeles, USA 90064; Computer Science Department, University of Kentucky, Kentucky, USA 40507; Computer Science Department, Zhejiang University, Zhejiang, China 30027

  • Venue:
  • 3D Research
  • Year:
  • 2011

Abstract

In a video conference, a user can either look at the remote image of the other party captured by a video camera, or look at the camera that is capturing her/him, but cannot do both at the same time. The lack of eye contact caused by this misalignment substantially reduces the effectiveness of communication and creates an unpleasant feeling of disconnectedness. We propose an approach that restores eye contact while the user looks at the remote image of the other party, using a novel system that combines a Time-of-Flight depth sensor with traditional stereo. The key to the success of this system is faithfully recovering the scene's depth. In this 2.5D space, manipulating the user's eye gaze becomes relatively easy. To evaluate the performance of the system, we conducted two user studies. One involves subjects trained to recognize eye gaze displayed in images; the other is a blind evaluation in which subjects have no prior knowledge about eye gaze. Both evaluations show that the system can bring desktop participants closer to each other.
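
The abstract only outlines the two steps of the pipeline: fusing ToF and stereo depth, then manipulating gaze in the recovered 2.5D space. The snippet below is a minimal NumPy sketch of those ideas, not the authors' implementation: a confidence-weighted fusion of a ToF depth map with a stereo-derived depth map, followed by forward-warping the color image into a hypothetical virtual camera (e.g. one placed behind the display) to simulate eye contact. All function names, parameters, and the weighting scheme are illustrative assumptions.

```python
import numpy as np

def fuse_depth(d_tof, c_tof, d_stereo, c_stereo):
    """Per-pixel confidence-weighted fusion of two aligned depth maps.

    d_tof, d_stereo : HxW depth maps in metres (0 where invalid)
    c_tof, c_stereo : HxW confidence maps in [0, 1]
    """
    w_tof = c_tof * (d_tof > 0)
    w_st = c_stereo * (d_stereo > 0)
    w_sum = w_tof + w_st
    return np.where(w_sum > 0,
                    (w_tof * d_tof + w_st * d_stereo) / np.maximum(w_sum, 1e-6),
                    0.0)

def render_virtual_view(color, depth, K, K_virt, R, t):
    """Forward-warp `color` into a virtual camera (nearest-pixel splatting, no hole filling).

    K, K_virt : 3x3 intrinsics of the real and virtual cameras
    R, t      : rotation (3x3) and translation (3,) from the real to the virtual camera
    """
    H, W = depth.shape
    out = np.zeros_like(color)
    v, u = np.mgrid[0:H, 0:W]
    valid = depth > 0
    # Back-project valid pixels to 3D points in the real camera frame.
    rays = np.linalg.inv(K) @ np.stack([u[valid], v[valid], np.ones(valid.sum())])
    pts = rays * depth[valid]
    # Transform into the virtual camera frame and project.
    proj = K_virt @ (R @ pts + t[:, None])
    x = np.round(proj[0] / proj[2]).astype(int)
    y = np.round(proj[1] / proj[2]).astype(int)
    z = proj[2]
    inside = (x >= 0) & (x < W) & (y >= 0) & (y < H) & (z > 0)
    # Crude z-buffer: write far points first so nearer points overwrite them.
    order = np.argsort(-z[inside])
    out[y[inside][order], x[inside][order]] = color[valid][inside][order]
    return out
```

A real system would additionally register the ToF and stereo depth maps into a common coordinate frame and fill the disocclusion holes left by forward warping; the sketch omits both for brevity.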