Findings from observational studies of collaborative work
International Journal of Man-Machine Studies: Computer-supported cooperative work and groupware, Part 1
One is not enough: multiple views in a media space
INTERCHI '93 Proceedings of the INTERCHI '93 conference on Human factors in computing systems
GestureCam: a video communication system for sympathetic remote collaboration
CSCW '94 Proceedings of the 1994 ACM conference on Computer supported cooperative work
Collaboration in performance of physical tasks: effects on outcomes and communication
CSCW '96 Proceedings of the 1996 ACM conference on Computer supported cooperative work
Inferring intent in eye-based interfaces: tracing eye movements with process models
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Video helps remote work: speakers who need to negotiate common ground benefit from seeing each other
Proceedings of the SIGCHI conference on Human Factors in Computing Systems
Coordination of communication: effects of shared visual context on collaborative work
CSCW '00 Proceedings of the 2000 ACM conference on Computer supported cooperative work
GestureMan: a mobile robot that embodies a remote instructor's actions
CSCW '00 Proceedings of the 2000 ACM conference on Computer supported cooperative work
Eye gaze patterns in conversations: there is more to conversational agents than meets the eyes
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Effects of head-mounted and scene-oriented video systems on remote collaboration on physical tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Where do helpers look?: gaze targets during collaborative physical tasks
CHI '03 Extended Abstracts on Human Factors in Computing Systems
Using eye movements to determine referents in a spoken dialogue system
Proceedings of the 2001 workshop on Perceptive user interfaces
Persistence matters: making the most of chat in tightly-coupled work
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Controlling interruptions: awareness displays and social motivation for coordination
CSCW '04 Proceedings of the 2004 ACM conference on Computer supported cooperative work
Action as language in a shared visual space
CSCW '04 Proceedings of the 2004 ACM conference on Computer supported cooperative work
Visual information as a conversational resource in collaborative physical tasks
Human-Computer Interaction
Gestures over video streams to support remote collaboration on physical tasks
Human-Computer Interaction
Modeling focus of attention for meeting indexing based on multiple cues
IEEE Transactions on Neural Networks
Analyzing and predicting focus of attention in remote collaborative tasks
ICMI '05 Proceedings of the 7th international conference on Multimodal interfaces
Comparing remote gesture technologies for supporting collaborative physical tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
An exploratory analysis of partner action and camera control in a video-mediated collaborative task
CSCW '06 Proceedings of the 2006 20th anniversary conference on Computer supported cooperative work
Gaze analysis in a remote collaborative setting
OZCHI '06 Proceedings of the 18th Australia conference on Computer-Human Interaction: Design: Activities, Artefacts and Environments
Dynamic shared visual spaces: experimenting with automatic camera control in a remote repair task
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
See what I'm saying?: using dyadic mobile eye tracking to study collaborative reference
Proceedings of the ACM 2011 conference on Computer supported cooperative work
A study of gestures in a video-mediated collaborative assembly task
Advances in Human-Computer Interaction
A new approach for cluster detection for large datasets with high dimensionality
DaWaK'05 Proceedings of the 7th international conference on Data Warehousing and Knowledge Discovery
How social cues shape task coordination and communication
Proceedings of the 17th ACM conference on Computer supported cooperative work & social computing
Quantitative evaluation of media space configuration in a task-oriented remote conference system
Proceedings of the 25th Australian Computer-Human Interaction Conference: Augmentation, Application, Innovation, Collaboration
Helpers providing guidance for collaborative physical tasks shift their gaze between the workspace, supply area, and instructions. Understanding when and why helpers gaze at each area is important both for a theoretical understanding of collaboration on physical tasks and for the design of automated video systems for remote collaboration. In a laboratory experiment using a collaborative puzzle task, we recorded helpers' gaze while manipulating task complexity and piece differentiability. Helpers gazed toward the pieces bay more frequently when pieces were difficult to differentiate and less frequently over repeated trials. Preliminary analyses of message content show that helpers tend to look at the pieces bay when describing the next piece and at the workspace when describing where it goes. The results are consistent with a grounding model of communication, in which helpers seek visual evidence of understanding unless they are confident that they have been understood. The results also suggest the feasibility of building automated video systems based on remote helpers' shifting visual requirements.
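The feasibility claim above can be illustrated with a sketch. This is not from the paper: the class name, the dwell-threshold logic, and the default view are assumptions introduced here for illustration. It shows one simple way an automated video system might switch the worker's shared view to whichever task area (workspace, pieces bay, or instructions) the helper is currently fixating, with a small dwell window so brief glances do not cause the view to flicker.

```python
from collections import deque

class GazeDrivenCameraSelector:
    """Illustrative sketch: pick a camera view from helper gaze samples.

    Area labels ("workspace", "pieces_bay", "instructions") follow the
    task areas named in the abstract; the dwell heuristic is an assumption,
    not the paper's method.
    """

    def __init__(self, dwell_frames=3):
        self.dwell_frames = dwell_frames          # consecutive samples needed to switch
        self.current = "workspace"                # assumed default view
        self.recent = deque(maxlen=dwell_frames)  # sliding window of gaze samples

    def update(self, gaze_target):
        """Feed one gaze sample; return the camera view to show."""
        self.recent.append(gaze_target)
        # Switch only after the helper has fixated a new area for the
        # full dwell window, so momentary glances are ignored.
        if (len(self.recent) == self.dwell_frames
                and all(t == gaze_target for t in self.recent)
                and gaze_target != self.current):
            self.current = gaze_target
        return self.current
```

For example, three consecutive samples on the pieces bay would switch the shared view there, while a single glance back at the workspace would leave it unchanged.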