In this paper, we investigate the concept of gaze-based interaction with 3D user interfaces. Stereoscopic displays are becoming ubiquitous, particularly as autostereoscopy enables the perception of 3D content without glasses. As a result, application areas for 3D beyond entertainment in cinema or at home are emerging, including work settings, mobile phones, public displays, and cars. At the same time, eye tracking is reaching the consumer market with low-cost devices. We envision future eye trackers being integrated into consumer devices (laptops, mobile phones, displays), allowing the user's gaze to be analyzed and used as input for interactive applications. A particular challenge in applying this concept to 3D displays is that current eye trackers provide the gaze point in 2D only (x and y coordinates). We compare the performance of two methods that use the eye's physiology to calculate the gaze point in 3D space, thus enabling gaze-based interaction with stereoscopic content. Furthermore, we compare gaze interaction in 2D and 3D with regard to user experience and performance. Our results show that, with current technology, eye tracking on stereoscopic displays is possible with performance similar to that on standard 2D screens.
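The abstract does not detail the two physiology-based methods, but one common approach to lifting 2D gaze samples into 3D is to exploit binocular vergence: cast one ray per eye through its 2D gaze point on the screen plane and take the midpoint of the shortest segment between the two rays as the 3D gaze point. The sketch below illustrates that geometric idea only; the eye positions, coordinate frame (screen plane at z = 0), and function name are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def gaze_point_3d(eye_left, eye_right, gaze_left, gaze_right):
    """Vergence-based sketch: estimate a 3D gaze point as the midpoint
    of the shortest segment between the two eyes' gaze rays.

    eye_*  : 3D eye positions (assumed known, e.g. from head tracking)
    gaze_* : 3D points each eye looks through, e.g. the tracker's 2D
             gaze samples lifted onto the screen plane at z = 0
    Returns None when the rays are near-parallel (no reliable depth).
    """
    p1, p2 = np.asarray(eye_left, float), np.asarray(eye_right, float)
    d1 = np.asarray(gaze_left, float) - p1
    d2 = np.asarray(gaze_right, float) - p2
    d1 /= np.linalg.norm(d1)
    d2 /= np.linalg.norm(d2)

    # Closed-form closest points of two skew rays p + t*d:
    # minimize |(p1 + t1*d1) - (p2 + t2*d2)| over t1, t2.
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    w = p1 - p2
    denom = a * c - b * b
    if abs(denom) < 1e-9:        # rays (almost) parallel
        return None
    t1 = (b * (d2 @ w) - c * (d1 @ w)) / denom
    t2 = (a * (d2 @ w) - b * (d1 @ w)) / denom
    return ((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0

# Illustrative geometry: eyes 6 cm apart, 60 cm from the screen,
# both fixating a virtual point 10 cm in front of the screen plane.
point = gaze_point_3d([-0.03, 0, 0.6], [0.03, 0, 0.6],
                      [0.006, 0, 0], [-0.006, 0, 0])
```

In practice such vergence estimates are noisy in depth, since small angular errors in each eye's gaze direction translate into large displacements along the viewing axis, which is why the paper evaluates and compares methods empirically.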