Conventional control interfaces such as joysticks, switches, and wheels predominate in teleoperation, yet operators often must control multiple complex devices simultaneously; in mining teleoperation, for example, an operator may drive a rock breaker and a remote camera at the same time. This overloads the operator's manual control capacity, increases workload, and reduces productivity. We present a novel gaze-driven remote camera control, with an implemented prototype, that follows a simple and natural design principle: "Whatever you look at on the screen, it moves to the centre!" A user study modelling a hands-busy task compared gaze-driven control against traditional joystick control using both objective and subjective measures. The results clearly show that gaze-driven control significantly outperformed the conventional joystick control.
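The re-centring principle can be illustrated with a small sketch: a gaze point on the screen is converted into the pan/tilt offset that would bring the fixated point to the screen centre. This is a minimal illustration under assumed parameters (screen size, camera field of view), not the authors' implementation; all names here are hypothetical.

```python
# Hypothetical sketch of the "look at it, it moves to the centre" principle.
# Assumes a pinhole-like mapping from screen position to camera angle;
# field-of-view values are illustrative, not from the paper.

def gaze_to_pan_tilt(gaze_x, gaze_y, screen_w, screen_h,
                     fov_h_deg=60.0, fov_v_deg=40.0):
    """Map a gaze point (pixels) to pan/tilt offsets (degrees) that
    re-centre the fixated point in the camera view."""
    # Normalised offset of the gaze point from the screen centre,
    # in the range [-0.5, 0.5] on each axis.
    dx = gaze_x / screen_w - 0.5
    dy = gaze_y / screen_h - 0.5
    # Scale by the field of view: a fixation at the screen edge
    # corresponds to half the field of view.
    pan = dx * fov_h_deg
    tilt = -dy * fov_v_deg  # screen y grows downward; positive tilt is up
    return pan, tilt

# Example: fixating the right edge, vertically centred, on a 1920x1080 screen
# yields a 30-degree pan to the right and no tilt.
pan, tilt = gaze_to_pan_tilt(1920, 540, 1920, 1080)
```

In practice such a controller would be gated by fixation detection (e.g. a dwell-time threshold) so that every stray glance does not move the camera, addressing the well-known Midas touch problem in gaze interaction.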