The human sciences have demonstrated that gesture is a critical element of human communication. While existing graphical solutions are appropriate for virtual agents, solving arm trajectories for physically embodied robots requires that we address the challenges of robot dynamics within a real-time gesture framework. We explore and evaluate a low-computational-cost gesture production algorithm that generates adequate gesture trajectories on a humanoid torso, as judged by participants in the human-robot gesturing studies presented in this paper. Our approach produces a constrained inverse-kinematic solution for the start and end points of a gesture and generates appropriate wrist angles; the gesture time is then used to calculate joint accelerations that yield a smooth, direct hand movement. Selecting open-hand gestures as an example gesture sub-domain, we implement our controller on BERTI, a bespoke upper-torso humanoid robot (Fig. 2). A qualitative pilot study highlights the gesture features most salient to users: gesture shape, timing, naturalness and smoothness. A controlled experimental study then demonstrates that, by these metrics, our algorithm performs well despite some dissimilarities with users' own gestures. We summarize the salient properties of robot gestures that emerge from these studies.
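The low-cost trajectory idea described above — joint accelerations computed from the gesture time to produce a smooth, direct movement between IK-derived start and end configurations — can be sketched as follows. This is a minimal illustration, not the paper's actual controller: it assumes the start and end joint angles have already been produced by a constrained IK solver, and it uses a simple symmetric accelerate-then-decelerate (bang-bang) profile; the function name and parameters are hypothetical.

```python
import numpy as np

def bang_bang_trajectory(q_start, q_end, gesture_time, steps=50):
    """Joint-space trajectory with constant acceleration over the first
    half of the gesture and constant deceleration over the second half.

    q_start, q_end: joint angles in radians (assumed to come from a
    constrained inverse-kinematics solution for the gesture endpoints).
    gesture_time: total gesture duration in seconds.
    Returns an array of shape (steps, n_joints).
    """
    q_start = np.asarray(q_start, dtype=float)
    q_end = np.asarray(q_end, dtype=float)

    # For a symmetric profile, the midpoint must be reached at T/2:
    #   q_start + 0.5 * a * (T/2)**2 == (q_start + q_end) / 2
    # which gives the per-joint acceleration a = 4 * (q_end - q_start) / T**2.
    accel = 4.0 * (q_end - q_start) / gesture_time**2

    ts = np.linspace(0.0, gesture_time, steps)
    half = gesture_time / 2.0
    traj = np.empty((steps, q_start.size))
    for i, t in enumerate(ts):
        if t <= half:
            # Accelerating phase: integrate forward from the start pose.
            traj[i] = q_start + 0.5 * accel * t**2
        else:
            # Decelerating phase: mirror image, integrated back from the
            # end pose so the hand arrives with zero velocity.
            t_rem = gesture_time - t
            traj[i] = q_end - 0.5 * accel * t_rem**2
    return traj
```

Because the profile is symmetric, velocity is zero at both endpoints and position is continuous at the midpoint, giving the direct, non-jerky hand motion the participants in the studies rated on smoothness.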