Collaboration between distributed human and robot partners during military operations is becoming increasingly necessary. To enable efficient real-time communication, it is important to develop user interfaces that support robust spoken language understanding. As a step toward this objective, this work examines the role of shared gaze between a human and a robot during remote spoken collaboration in a distributed military operation. Preliminary results show that an interface supporting shared gaze between a human and robot in a remote collaborative HRI search task has the potential to improve both automated language understanding and task efficiency.