Camera viewpoint control plays a vital role in many teleoperation activities, as watching live video streams remains the fundamental way for operators to gain situational awareness of remote environments. Motivated by a real-world industrial setting in mining teleoperation, we explore several solutions to a common multi-tasking situation in which an operator must control a robot and simultaneously operate a remote camera. Conventional control interfaces predominate in such settings, but they can overload the operator's hands and demand frequent attention switches, which can reduce productivity. We report an empirical user study in a model multi-tasking teleoperation setting in which the user has a main task that requires their attention. We compare three camera viewpoint control models: (1) dual manual control, (2) natural interaction (combining eye gaze and head motion), and (3) autonomous tracking. The results indicate the advantages of the natural interaction model, while the manual control model performed worst.
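The natural interaction model combines eye gaze with head motion for camera control. As a minimal sketch of one way such a cascaded scheme could work (the function name, gains, and dead-zone threshold below are illustrative assumptions, not the controller evaluated in the study), gaze direction can drive coarse pan/tilt re-centering while head motion contributes a fine adjustment:

```python
def camera_command(gaze_xy, head_delta, deadzone=0.1,
                   gaze_gain=1.0, head_gain=0.3):
    """Cascade gaze (coarse) and head motion (fine) into pan/tilt velocities.

    gaze_xy:    normalized gaze position on screen in [-1, 1] x [-1, 1],
                with (0, 0) at the screen center.
    head_delta: (dx, dy) head displacement since the last frame.
    Returns (pan, tilt) velocity commands.
    Illustrative sketch only -- not the paper's implementation.
    """
    gx, gy = gaze_xy
    # Gaze outside a central dead zone drives coarse re-centering,
    # so small fixational eye movements do not move the camera.
    pan = gaze_gain * gx if abs(gx) > deadzone else 0.0
    tilt = gaze_gain * gy if abs(gy) > deadzone else 0.0
    # Head motion is layered on top as a fine adjustment.
    pan += head_gain * head_delta[0]
    tilt += head_gain * head_delta[1]
    return pan, tilt
```

The dead zone is the key design choice in this sketch: it leaves the viewpoint still while the operator's gaze stays near the center (e.g. on the main task), and only large gaze excursions or deliberate head motion steer the camera, freeing both hands for robot control.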