Dynamic Time Warping for Off-Line Recognition of a Small Gesture Vocabulary
RATFG-RTS '01 Proceedings of the IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems (RATFG-RTS'01)
Position-invariant, real-time gesture recognition based on dynamic time warping
Proceedings of the 8th ACM/IEEE international conference on Human-robot interaction
Human-robot interaction through 3D vision and force control
Proceedings of the 2014 ACM/IEEE international conference on Human-robot interaction
For human-robot interaction to take place, a robot needs to perceive humans. The space in which a robot can perceive humans is constrained by the limitations of its sensors. These restrictions can be circumvented with external sensors, as in intelligent environments; otherwise, humans must ensure that they remain perceivable. With the robotic platform presented here, the roles are reversed: the robot autonomously ensures that the human stays within its perceptual range. This is achieved by a combination of hardware and algorithms capable of autonomously tracking the person, estimating their position, and following them, all while recognizing their gestures and moving through space.
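The gesture-recognition technique named in the titles above, dynamic time warping (DTW), aligns an observed gesture trajectory with a stored template while tolerating differences in speed. A minimal sketch for 1-D sequences follows; the cited systems operate on multi-dimensional body-pose features, so the sequences and names here are purely illustrative.

```python
def dtw_distance(a, b):
    """Return the DTW alignment cost between sequences a and b.

    Classic O(n*m) dynamic program: cost[i][j] is the cheapest way
    to align the first i samples of a with the first j samples of b.
    """
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])      # local distance
            cost[i][j] = d + min(cost[i - 1][j],      # a advances
                                 cost[i][j - 1],      # b advances
                                 cost[i - 1][j - 1])  # both advance
    return cost[n][m]

# A stored gesture template and a slower performance of the same gesture:
# DTW stretches the template in time, so the cost stays small even
# though the sequences have different lengths.
template = [0.0, 1.0, 2.0, 1.0, 0.0]
observed = [0.0, 0.5, 1.0, 2.0, 2.0, 1.0, 0.0]
print(dtw_distance(template, observed))  # small cost despite length mismatch
```

In a small-vocabulary recognizer, the observed sequence is compared against one template per gesture class, and the class with the lowest DTW cost (below a rejection threshold) is reported.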