In video conferencing, a user can either look at the remote image of the other participant or look at the camera that is capturing her/him, but not both at the same time. The lack of eye contact caused by this misalignment substantially reduces the effectiveness of communication and creates an unpleasant feeling of disconnectedness. We propose an approach that restores eye contact while the user looks at the remote image of the other participant, using a novel system that combines a Time-of-Flight depth sensor with traditional stereo. The key to the system's success is faithfully recovering the scene's depth; in the resulting 2.5D space, manipulating the user's eye gaze becomes relatively easy. To evaluate the system's performance, we conducted two user studies: one with subjects trained to recognize the eye gaze displayed in images, and a blind evaluation with subjects who had no prior knowledge of eye gaze. Both evaluations show that the system brings desktop participants closer to each other.
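Once depth is recovered, gaze correction amounts to re-rendering the recovered 2.5D geometry from a virtual camera aligned with the remote image on the screen. The following is a minimal illustrative sketch of that idea, not the authors' implementation: each depth pixel is back-projected into 3D using hypothetical pinhole intrinsics, tilted by the camera-to-screen angle, and re-projected into the virtual view.

```python
import math

def backproject(u, v, d, fx, fy, cx, cy):
    # Pixel (u, v) with depth d -> 3D point in camera coordinates
    # under a simple pinhole model.
    return ((u - cx) * d / fx, (v - cy) * d / fy, d)

def project(X, Y, Z, fx, fy, cx, cy):
    # 3D camera-space point -> pixel coordinates in the virtual view.
    return (fx * X / Z + cx, fy * Y / Z + cy)

def rotate_x(p, angle):
    # Tilt a point about the camera's x-axis, simulating a virtual
    # camera placed behind the remote image instead of above the screen.
    X, Y, Z = p
    c, s = math.cos(angle), math.sin(angle)
    return (X, c * Y - s * Z, s * Y + c * Z)

def synthesize_pixel(u, v, d, angle, fx=500.0, fy=500.0, cx=320.0, cy=240.0):
    # Warp one depth pixel into the gaze-corrected virtual view.
    # The intrinsics fx, fy, cx, cy are placeholder values, not
    # calibration results from the paper.
    p = backproject(u, v, d, fx, fy, cx, cy)
    return project(*rotate_x(p, angle), fx, fy, cx, cy)
```

In a full pipeline this warp would be applied to every pixel of the fused ToF/stereo depth map, with hole filling and texture blending; a tilt of zero leaves the image unchanged, e.g. `synthesize_pixel(320, 240, 1.0, 0.0)` maps the principal point back to `(320.0, 240.0)`.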