We investigate the use of dual eye-tracking in a collaborative game setting. Social context influences individual gaze and action during a collaborative Tetris game: results show that both experts and novices adapt their playing style when interacting in mixed-ability pairs. The long-term goal of our work is to design adaptive gaze-awareness tools that take pair composition into account. We therefore investigate the automatic detection (or recognition) of pair composition using dual gaze-based as well as action-based multimodal features. We describe several methods for improving detection and experimentally demonstrate their effectiveness, especially in situations where the collected gaze data are noisy.
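To make the detection task concrete, the following is a minimal, hypothetical sketch of classifying pair composition (expert-expert, mixed, novice-novice) from dual gaze- and action-based features, using a simple nearest-centroid classifier. The feature names, values, and classifier choice here are illustrative assumptions, not the features or methods actually used in the study.

```python
# Hypothetical sketch: classifying pair composition from dual
# gaze/action features with a nearest-centroid classifier.
# Features and values are invented for illustration; the paper's
# actual features and classifier may differ.

def centroid(rows):
    # Component-wise mean of a list of feature vectors.
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def nearest_centroid_fit(X, y):
    # Compute one centroid per class label.
    return {label: centroid([x for x, l in zip(X, y) if l == label])
            for label in set(y)}

def nearest_centroid_predict(model, x):
    # Assign x to the class whose centroid is closest (squared Euclidean).
    def dist2(c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    return min(model, key=lambda label: dist2(model[label]))

# Toy training data: [gaze cross-recurrence, action-rate difference]
# per pair (both feature names are assumptions).
X = [[0.62, 0.05], [0.58, 0.08],   # expert-expert pairs
     [0.40, 0.30], [0.44, 0.28],   # mixed-ability pairs
     [0.25, 0.06], [0.29, 0.04]]   # novice-novice pairs
y = ["EE", "EE", "mixed", "mixed", "NN", "NN"]

model = nearest_centroid_fit(X, y)
print(nearest_centroid_predict(model, [0.42, 0.29]))  # prints "mixed"
```

Averaging features into per-class centroids also offers some robustness to noisy gaze samples, since individual outlier measurements are smoothed out in the class means.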