In a collaborative system, co-presence, the sense of sharing a working environment with the remote participants, is essential for natural and efficient task performance. One way to achieve co-presence is to recreate the participants as realistically as possible, for instance with full-body 3D representations. In this paper, we introduce a method for recreating remote operators and immersing them in a collaborative augmented reality (AR) environment. The method captures 3D point clouds of the remote operators and reconstructs them in the shared environment in real time. To enable interaction among the participants, each operator's motion is tracked using a feature extraction and point matching (PM) algorithm. With this participant tracking, various types of 3D interaction become possible.
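The abstract does not detail the paper's PM algorithm, but the core idea of point matching can be illustrated with a minimal sketch: pair each captured point with its nearest neighbor in a reference cloud, then estimate a rigid translation from the matched centroids. All function names here are hypothetical and this is a simplification (a full tracker would also estimate rotation and iterate, as in ICP), not the authors' implementation.

```python
import math

def nearest_neighbor_pairs(source, target):
    """For each source point, find the index of its closest target point
    (brute-force nearest-neighbor correspondence)."""
    pairs = []
    for i, p in enumerate(source):
        dists = [math.dist(p, q) for q in target]
        pairs.append((i, dists.index(min(dists))))
    return pairs

def centroid(points):
    """Component-wise mean of a list of 3D points."""
    n = len(points)
    return tuple(sum(c) / n for c in zip(*points))

def estimate_translation(source, target):
    """One simplified point-matching step: translate the source cloud so its
    centroid coincides with the centroid of its matched target points."""
    pairs = nearest_neighbor_pairs(source, target)
    matched = [target[j] for _, j in pairs]
    cs, cm = centroid(source), centroid(matched)
    return tuple(m - s for s, m in zip(cs, cm))

# Example: a cloud shifted by 0.3 along x is recovered by the estimator.
src = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
tgt = [(0.3, 0.0, 0.0), (1.3, 0.0, 0.0)]
t = estimate_translation(src, tgt)
```

In practice, real-time tracking of dense captured clouds would replace the brute-force search with a spatial index (e.g. a k-d tree) and restrict matching to extracted feature points, as the abstract's mention of feature extraction suggests.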