Research on gesture generation for embodied conversational agents (ECAs) mostly focuses on gesture types such as pointing and iconic gestures, while ignoring another gesture type frequently used by human speakers: beat gestures. Analysis of a corpus of route descriptions showed that although annotators show very low agreement when applying a ‘beat filter’ aimed at identifying physical features of beat gestures, they are capable of reliably distinguishing beats from other gestures in a more intuitive manner. Beat gestures made up more than 30% of the gestures in our corpus, and they were sometimes used when expressing concepts for which other gesture types seemed a more obvious choice. Based on these findings, we propose a simple, probabilistic model of beat production for ECAs. However, more research is needed to determine why direction givers sometimes use beats when other gestures seem more appropriate, and vice versa.
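The abstract does not specify the proposed probabilistic model, but a minimal sketch of such a model might sample a gesture type from a corpus-derived distribution, independently of the concept being expressed. In the illustration below, only the beat share (just over 30%) comes from the abstract; the remaining probabilities and the function name `choose_gesture` are hypothetical placeholders:

```python
import random

# Hypothetical gesture-type distribution for an ECA gesture planner.
# Only the beat proportion (>30%) is reported in the abstract; the
# other values are illustrative placeholders, not corpus figures.
GESTURE_PROBS = {
    "beat": 0.32,
    "iconic": 0.28,
    "pointing": 0.25,
    "other": 0.15,
}

def choose_gesture(rng: random.Random = random) -> str:
    """Sample a gesture type from the distribution, mirroring a
    simple content-independent probabilistic production model."""
    r = rng.random()
    cumulative = 0.0
    for gesture, p in GESTURE_PROBS.items():
        cumulative += p
        if r < cumulative:
            return gesture
    return "other"  # guard against floating-point rounding
```

A planner built this way would occasionally produce a beat even where an iconic or pointing gesture seems the obvious choice, which matches the corpus observation, though it leaves open the question of when speakers actually make that substitution.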