UI for a videoconference camera
CHI '01 Extended Abstracts on Human Factors in Computing Systems
This paper describes a user interface for video chat capable of pan, tilt, and zoom (PTZ) operation via head tracking. The approach maps the 3D position captured by a head tracker to the PTZ parameters of a remote camera, so that users can change the view intuitively, just as people change their sight by moving their heads. A preliminary user study gave encouraging results and identified points for further improvement.
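The mapping the abstract describes can be sketched roughly as follows. This is an illustrative Python sketch, not the authors' implementation: the coordinate convention (head position in meters, camera-centered), the reference distance `z_ref`, the zoom range, and the angular mapping are all assumptions made for the example.

```python
import math


def clamp(value, lo, hi):
    """Limit value to the inclusive range [lo, hi]."""
    return max(lo, min(hi, value))


def head_to_ptz(x, y, z, z_ref=0.6, max_zoom=4.0):
    """Map a tracked 3D head position to PTZ parameters for a remote camera.

    x, y, z : head position in meters, camera-centered; z is the
              distance from the screen/tracker (assumed convention).
    Returns (pan_deg, tilt_deg, zoom_factor).
    """
    # Lateral/vertical head motion becomes a viewing angle: looking
    # from offset (x, y) at distance z corresponds to these angles.
    pan = math.degrees(math.atan2(x, z))
    tilt = math.degrees(math.atan2(y, z))
    # Leaning toward the screen (z < z_ref) zooms in, up to max_zoom;
    # z_ref is an assumed comfortable sitting distance.
    zoom = clamp(z_ref / z, 1.0, max_zoom)
    return pan, tilt, zoom
```

For example, a head centered at the reference distance yields a neutral view (`pan = tilt = 0`, `zoom = 1`), while leaning halfway toward the screen doubles the zoom factor. A real system would additionally smooth the tracker signal and rate-limit the camera commands.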