In this paper, we introduce the idea of projecting different image elements from multiple viewpoints and combining them in a single stereoscopic image. We use this technique to enable multi-user interaction and co-located collaboration in large projection-based immersive display systems. Viewing a stereoscopic image from a viewpoint other than the projection viewpoint introduces parallax errors, a skew distortion of the spatial image that misaligns real and virtual object positions for multiple viewers sharing a single-view stereoscopic display system. With multi-viewpoint images, we can project different image elements from multiple viewpoints, corresponding to the viewing positions of multiple users, and combine them in a single image. We use this technique to project interaction elements for each user at the correct position and depth, matching, from the user's point of view, the tracked real positions of interaction devices with the virtual positions of visual and functional interaction elements such as pointers, menus, or picking rays. We introduce a rendering method that combines projections for different viewpoints into a single, consistent stereoscopic image. We have used multi-viewpoint images for interaction in applications where a large audience views a non-head-tracked immersive presentation while a guide user controls the application with direct interaction techniques. Furthermore, we have developed complex interaction scenarios in which multiple users share a conventional single-view projection-based display environment for co-located collaboration.
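The core of projecting an image element from a particular user's viewpoint is an asymmetric-frustum (off-axis) projection computed from that user's tracked eye position relative to the fixed projection screen; rendering each user's pointers, menus, and picking rays with their own such projection keeps the virtual elements aligned with the real devices. The sketch below illustrates this standard off-axis construction for a planar screen; it is not the authors' implementation, and all names are illustrative.

```python
import numpy as np

def off_axis_projection(eye, screen_ll, screen_lr, screen_ul,
                        near=0.1, far=100.0):
    """Asymmetric-frustum projection for a fixed screen and a tracked eye.

    eye, screen_ll, screen_lr, screen_ul: world-space positions of the
    viewer's eye and the screen's lower-left, lower-right, and upper-left
    corners. Returns a 4x4 matrix mapping world space to clip space.
    """
    # Orthonormal screen basis: right, up, normal.
    vr = screen_lr - screen_ll
    vu = screen_ul - screen_ll
    vr = vr / np.linalg.norm(vr)
    vu = vu / np.linalg.norm(vu)
    vn = np.cross(vr, vu)
    vn = vn / np.linalg.norm(vn)

    # Vectors from the eye to three screen corners.
    va = screen_ll - eye
    vb = screen_lr - eye
    vc = screen_ul - eye

    d = -np.dot(va, vn)            # perpendicular eye-to-screen distance
    l = np.dot(vr, va) * near / d  # frustum extents on the near plane
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style frustum matrix (possibly asymmetric).
    P = np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])
    # Rotate world into screen space, then translate the eye to the origin.
    M = np.eye(4)
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ M @ T
```

Rendering each user's interaction elements with the matrix derived from that user's head position, then compositing the results into one stereoscopic frame, is the per-viewpoint step the abstract's combination method builds on.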