Notification user interfaces. International Journal of Human-Computer Studies.
An introduction to ROC analysis. Pattern Recognition Letters - Special issue: ROC analysis in pattern recognition.
Predicting postcompletion errors using eye movements. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans.
Validating human-robot interaction schemes in multitasking environments. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans.
Predicting Controller Capacity in Supervisory Control of Multiple UAVs. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans.
Hard lessons learned: mobile eye-tracking in cockpits. Proceedings of the 4th Workshop on Eye Gaze in Intelligent Human Machine Interaction.
Supervisory guide part I: detecting gaps in UAV swarm operator situation awareness. CHI '13 Extended Abstracts on Human Factors in Computing Systems.
For a single operator to control multiple robots effectively, operator situation awareness is a critical component of the human-robot system. Situation awareness comprises three levels: perception, comprehension, and projection into the future [1]. We focus on the perception level to develop a theoretical model of the perceptual-cognitive processes underlying situation awareness. Eye movement measures were developed as indicators of cognitive processing and used to account for operator situation awareness in a supervisory control task. The resulting eye-movement-based model emphasizes visual scanning and attention allocation as the cognitive processes that give rise to operator situation awareness, and it lays the groundwork for real-time prediction of operator situation awareness.
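The pipeline described above, deriving a scanning measure from eye movements and evaluating it as a predictor of situation awareness (e.g., with the ROC analysis cited in the references), can be sketched minimally. Everything below is a hypothetical illustration, not the paper's actual method: the AOI names, the trial data, and the choice of Shannon entropy of fixations over areas of interest as the scanning measure are all assumptions.

```python
import math
from collections import Counter

def scanning_entropy(fixations):
    """Shannon entropy (bits) of the distribution of fixations over
    areas of interest; a common proxy for breadth of visual scanning."""
    counts = Counter(fixations)
    n = len(fixations)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the rank-sum formulation: the
    probability that a positive case outscores a negative one
    (ties count one half)."""
    wins = sum(
        1.0 if p > q else 0.5 if p == q else 0.0
        for p in scores_pos for q in scores_neg
    )
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical trials: AOI visit sequences for operators who did (pos)
# and did not (neg) maintain perception-level situation awareness.
pos_trials = [["map", "robot1", "robot2", "alerts"],
              ["map", "alerts", "robot1", "map"]]
neg_trials = [["map", "map", "map", "robot1"],
              ["robot2", "robot2", "robot2", "robot2"]]

pos_scores = [scanning_entropy(t) for t in pos_trials]
neg_scores = [scanning_entropy(t) for t in neg_trials]
print(auc(pos_scores, neg_scores))  # prints 1.0 for this toy data
```

In this toy data the broad scanners always out-score the narrow ones, so the AUC is 1.0; real eye-tracking data would of course yield an intermediate value, and a real-time predictor would threshold the measure over a sliding window of recent fixations.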