This paper studies media synchronization control between voice and the movement of computer-graphics (CG) avatars in networked virtual environments. For the control, we adopt the virtual-time rendering (VTR) media synchronization algorithm, which the authors previously proposed. In the VTR algorithm, we select voice as the master stream and avatar movement as the slave stream. Through an experiment in which a user interactively moves both arms of an avatar in synchrony with the voice, we assess the media synchronization quality and demonstrate the effectiveness of the media synchronization control.
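The master-slave scheduling idea in the abstract can be sketched in code: the master stream (voice) defines a virtual playout time, and slave-stream units (avatar movement) are rendered only when that virtual time reaches their timestamps, with superseded units skipped. This is a minimal illustrative sketch under assumed names and a simplified skip policy, not the authors' exact VTR design.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class MediaUnit:
    timestamp: float  # generation time at the sender (seconds); assumed field
    payload: str

class SlaveStreamScheduler:
    """Schedules slave-stream units (e.g., avatar movement) against the
    master stream's (e.g., voice) virtual playout clock. Units whose
    timestamps have been overtaken by a newer due unit are skipped."""

    def __init__(self, units: List[MediaUnit]):
        self.queue = sorted(units, key=lambda u: u.timestamp)
        self.rendered: List[str] = []
        self.skipped: List[str] = []

    def advance(self, master_virtual_time: float) -> Optional[str]:
        """Called each tick with the master stream's current virtual time.
        Renders the newest due unit and discards any stale ones."""
        due = None
        while self.queue and self.queue[0].timestamp <= master_virtual_time:
            if due is not None:
                # An even newer unit is also due: drop the older one
                # to keep the slave stream aligned with the master.
                self.skipped.append(due.payload)
            due = self.queue.pop(0)
        if due is not None:
            self.rendered.append(due.payload)
            return due.payload
        return None
```

For example, if units stamped 0.0, 0.1, and 0.2 s arrive and the voice clock has already advanced to 0.15 s, the scheduler renders the 0.1 s unit and skips the 0.0 s one, keeping the avatar's motion in step with the voice rather than replaying stale movement.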