The portability of modern eye-tracking systems motivates a technique for estimating the 3D point-of-regard. Unlike conventional methods, which estimate the point-of-regard in the 2D image coordinates of the head-mounted camera, such a technique can represent richer gaze information for a person moving through a larger area. In this paper, we propose a method for estimating the 3D point-of-regard under natural head movements with a head-mounted device, together with a technique for visualizing gaze trajectories. We employ a visual SLAM technique to estimate the head pose and to extract environmental information. Even when the head moves dynamically, the proposed method obtains the 3D point-of-regard, and gaze trajectories are appropriately overlaid on the scene camera image.
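The abstract does not spell out how the 3D point-of-regard is computed. As a rough sketch only, assuming the visual SLAM pipeline yields, for each frame, the eye position (line origin) and the gaze direction expressed in world coordinates, the 3D point-of-regard can be triangulated as the least-squares intersection of the view lines observed across frames; the function name and interface below are illustrative, not taken from the paper:

```python
import numpy as np

def triangulate_point_of_regard(origins, directions):
    """Least-squares intersection of view lines.

    Each view line is o_i + s * d_i (origin o_i, unit direction d_i).
    The 3D point p minimizing the sum of squared perpendicular
    distances to all lines satisfies the normal equations
        sum_i (I - d_i d_i^T) p = sum_i (I - d_i d_i^T) o_i.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = np.asarray(d, dtype=float)
        d = d / np.linalg.norm(d)          # normalize gaze direction
        M = np.eye(3) - np.outer(d, d)     # projector onto the line's normal plane
        A += M
        b += M @ np.asarray(o, dtype=float)
    return np.linalg.solve(A, b)           # requires >= 2 non-parallel lines
```

With at least two non-parallel view lines the system is well conditioned; in practice one would accumulate lines over a short temporal window and reject outliers caused by blinks or tracking noise.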