Improving pronominal and deictic co-reference resolution with multi-modal features
SIGDIAL '11: Proceedings of the SIGDIAL 2011 Conference
This paper describes our ongoing work on resolving third-person pronouns and deictic words in a multi-modal corpus. We show that about two-thirds of these referring expressions have antecedents introduced by pointing gestures or by haptic-ostensive actions (actions that involve manipulating an object). After describing our annotation scheme, we discuss the co-reference models we learn from multi-modal features. The use of haptic-ostensive actions in a co-reference model is a novel contribution of this work.
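
As a rough illustration of how such a model can be learned, the sketch below trains a mention-pair co-reference classifier whose feature vector combines a standard linguistic cue with gesture and haptic-ostensive features. The Mention fields, the pair_features function, the toy data, and the choice of an SVM learner are all illustrative assumptions, not the paper's actual annotation scheme, feature set, or learner.

# Hypothetical sketch: a mention-pair co-reference classifier whose features
# include multi-modal cues (pointing gestures, haptic-ostensive actions)
# alongside a standard linguistic feature. All names here are illustrative.
from dataclasses import dataclass

from sklearn.svm import SVC


@dataclass
class Mention:
    surface: str       # surface form, e.g. "it" or "the red cube"
    turn: int          # dialogue turn index
    is_pronoun: bool
    pointed_at: bool   # mention was accompanied by a pointing gesture
    manipulated: bool  # referent was touched/moved (haptic-ostensive action)


def pair_features(anaphor: Mention, candidate: Mention) -> list:
    """Feature vector for one (anaphor, candidate antecedent) pair."""
    return [
        float(anaphor.turn - candidate.turn),         # dialogue distance
        float(anaphor.is_pronoun),
        float(candidate.pointed_at),                  # gesture feature
        float(candidate.manipulated),                 # haptic-ostensive feature
        float(anaphor.surface == candidate.surface),  # exact string match
    ]


# Toy annotated data: (anaphor, candidate antecedent, co-referent?).
cube = Mention("the red cube", turn=1, is_pronoun=False,
               pointed_at=True, manipulated=True)
tray = Mention("the tray", turn=1, is_pronoun=False,
               pointed_at=False, manipulated=False)
it = Mention("it", turn=2, is_pronoun=True,
             pointed_at=False, manipulated=False)

pairs = [(it, cube, 1), (it, tray, 0)]
X = [pair_features(a, c) for a, c, _ in pairs]
y = [label for *_, label in pairs]

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([pair_features(it, cube)]))  # expected: [1]

In this framing, the haptic-ostensive cue is just one more column in the feature vector, which is what makes it straightforward to combine with linguistic and gesture features inside any standard mention-pair learner.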