In natural interaction, gaze assumes a variety of roles that a gaze-contingent interface may need to distinguish. Information from other modalities can make this distinction easier. In this study, mutual information (MI) is proposed as a variable for distinguishing gaze roles using information from both gaze and speech. A pilot experiment was conducted in which different gaze behaviours were elicited from participants using acoustic noise. Initial results show that MI distinguishes between gaze roles better than gaze characteristics alone. This work demonstrates the potential of MI as a variable for distinguishing gaze roles in multimodal interfaces and highlights the need to account for acoustic noise when interpreting gaze in human-machine interaction.
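As an illustrative sketch only (not the paper's implementation, whose feature extraction and estimator are not described here), MI between two discretized 1-D signals, such as a gaze feature and a speech feature sampled over time, can be estimated from their joint histogram with NumPy:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based estimate of MI (in bits) between two 1-D signals.

    Bin count and signal choice are illustrative assumptions; real
    gaze/speech features would be extracted and aligned beforehand.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # joint distribution p(x, y)
    px = pxy.sum(axis=1, keepdims=True)       # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)       # marginal p(y), shape (1, bins)
    nz = pxy > 0                              # skip empty cells: 0 * log 0 = 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Synthetic example: a gaze signal coupled to speech carries more MI
# with it than an unrelated gaze signal does.
rng = np.random.default_rng(0)
speech = rng.normal(size=5000)
gaze_dep = speech + 0.1 * rng.normal(size=5000)  # gaze tracking speech
gaze_ind = rng.normal(size=5000)                 # unrelated gaze
mi_dep = mutual_information(speech, gaze_dep)
mi_ind = mutual_information(speech, gaze_ind)
```

With coupled signals `mi_dep` is substantially larger than `mi_ind`, which is the kind of contrast a gaze-role classifier could exploit.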