This paper explores techniques for visualising display changes in multi-display environments. We present four subtle gaze-dependent techniques for visualising change on unattended displays: FreezeFrame, PixMap, WindowMap and Aura. To enable the techniques to be deployed directly to workstations, we also present a system that automatically locates the user's eyes using computer vision and a set of web cameras mounted on the displays. An evaluation confirms that this system can detect which display the user is attending to with high accuracy. We studied the efficacy of the visualisation techniques in a case study with a working professional, who used our system for eight hours per day on five consecutive days. The results show that the participant found the system and the techniques useful, subtle, calm and non-intrusive. We conclude by discussing the challenges of evaluating intelligent subtle interaction techniques using traditional experimental paradigms.
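The attention-detection step the abstract describes can be sketched in a few lines: each display has a webcam, eye detection runs on every camera's frame, and the display whose camera currently sees an eye pair is treated as the attended one. The sketch below is an assumption about how such a system might be structured, not the authors' implementation; function names (`eye_count`, `attended_display`, `poll_once`) and thresholds are illustrative, and it uses OpenCV's stock Haar cascades (Viola-Jones style detection) as the eye detector.

```python
# Minimal sketch (not the authors' code) of attended-display detection:
# one webcam per display runs eye detection; the display whose camera
# sees a full eye pair is taken as the one the user is attending to.
try:
    import cv2  # OpenCV; only needed for the live camera polling loop
except ImportError:
    cv2 = None  # the selection logic below still works without OpenCV

def eye_count(frame_gray, cascade):
    """Number of eye detections in a grayscale camera frame."""
    return len(cascade.detectMultiScale(frame_gray,
                                        scaleFactor=1.1, minNeighbors=5))

def attended_display(eye_counts):
    """Given per-display eye-detection counts, return the index of the
    attended display, or None if no camera sees a full eye pair
    (requiring two detections filters single false positives)."""
    best = max(range(len(eye_counts)), key=eye_counts.__getitem__)
    return best if eye_counts[best] >= 2 else None

def poll_once(cameras, cascade):
    """One polling step: grab a frame from each display's camera and
    decide which display the user is currently looking at."""
    counts = []
    for cam in cameras:
        ok, frame = cam.read()
        if ok:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            counts.append(eye_count(gray, cascade))
        else:
            counts.append(0)
    return attended_display(counts)
```

In a live setup one would load a detector such as `cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")`, open one `cv2.VideoCapture` per display, and call `poll_once` on a timer; the returned index then gates which displays are treated as "unattended" for the change-visualisation techniques.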