To analyze human visual attention validly, it is often necessary to move from computer-based desktop set-ups to more natural real-world settings. However, the resulting loss of experimental control has to be counterbalanced by increasing the number of participants and/or items. Together with the effort required to manually annotate the gaze-cursor videos recorded by mobile eye trackers, this renders many studies infeasible. We tackle this issue by minimizing the need for manual annotation of mobile gaze data. Our approach combines geometric modelling with inexpensive 3D marker tracking to align virtual proxies with the corresponding real-world objects. This allows fixations on objects of interest to be classified automatically while supporting a freely moving participant. The paper presents the EyeSee3D method as well as a comparison between expensive outside-in (external cameras) and low-cost inside-out (scene camera) tracking of the eye tracker's position. The EyeSee3D approach is evaluated by comparing the results of automatic fixation-target classification against manual annotation, revisiting the long-standing question of annotation validity in this mobile setting.
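The core of the automatic classification described above can be illustrated as a gaze-ray intersection test: once the virtual proxies are aligned with the real objects and the eye tracker's pose is known, each fixation reduces to a ray cast against the proxy geometry. The following is a minimal sketch, not the authors' implementation; the spherical proxies, object names, and function are hypothetical placeholders for whatever proxy shapes an actual study would use.

```python
import numpy as np

def classify_fixation(origin, direction, proxies):
    """Return the name of the nearest spherical proxy hit by the gaze ray.

    origin, direction: gaze ray in world coordinates (from tracked head pose).
    proxies: dict mapping object name -> (center, radius) spherical proxy.
    Returns None if the ray misses every proxy.
    """
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)                 # normalize gaze direction
    best_name, best_t = None, np.inf
    for name, (center, radius) in proxies.items():
        oc = np.asarray(center, float) - np.asarray(origin, float)
        t = oc @ d                            # distance along ray to closest approach
        if t < 0:
            continue                          # proxy lies behind the viewer
        miss = np.linalg.norm(oc - t * d)     # perpendicular distance to ray
        if miss <= radius and t < best_t:     # hit, and nearer than previous hits
            best_name, best_t = name, t
    return best_name

# Hypothetical scene: two objects modelled as spheres in world coordinates.
proxies = {"vase": ((0.0, 0.0, 2.0), 0.2),
           "book": ((1.0, 0.0, 2.0), 0.15)}
print(classify_fixation((0, 0, 0), (0, 0, 1), proxies))  # -> vase
```

Running this per fixation replaces frame-by-frame manual annotation of the gaze-cursor video; the choice of nearest hit along the ray handles proxies that occlude one another.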