We present a simple and intuitive method of user interaction, based on pointing gestures, that can be used with video avatars during remote collaboration. By connecting the user's head and fingertip in 3D space, we identify the direction in which they are pointing. Stereo infrared cameras in front of the user, together with an overhead camera, locate the user's head and fingertip in a CAVE™-like system. The head position is taken to be the top of the user's silhouette, while the fingertip is located directly in 3D space by matching its position in the overhead camera image against the stereo camera images in real time. The user can then interact with the first object that collides with the pointing ray. Experimental results show the outcome of the interaction together with the video avatar that is visible to a remote collaborator.
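The selection step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the head and fingertip have already been recovered as 3D points by the camera system, and it approximates scene objects as spheres so that the "first object that collides with the pointing ray" reduces to a nearest ray-sphere intersection test. All function and variable names are hypothetical.

```python
import math

def normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along the ray to the nearest sphere intersection, or None."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0 else None  # ignore hits behind the user

def pick_object(head, fingertip, spheres):
    """Select the first object hit by the head-through-fingertip ray.

    head, fingertip: 3D points from the tracking system (assumed given).
    spheres: iterable of (name, center, radius) object proxies.
    """
    direction = normalize(tuple(f - h for f, h in zip(fingertip, head)))
    best = None
    for name, center, radius in spheres:
        t = ray_sphere_hit(head, direction, center, radius)
        if t is not None and (best is None or t < best[1]):
            best = (name, t)
    return best[0] if best else None

# Pointing straight ahead selects the closer of two objects on the ray.
scene = [("far_object", (0.0, 0.0, 5.0), 1.0),
         ("near_object", (0.0, 0.0, 3.0), 0.5)]
selected = pick_object((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene)
```

Using the head rather than the eye or shoulder as the ray origin matches the paper's silhouette-top heuristic; a production system would also need the stereo/overhead triangulation that produces the two 3D points in the first place.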