International Journal of Computer Vision
Pfinder: Real-Time Tracking of the Human Body
IEEE Transactions on Pattern Analysis and Machine Intelligence
Artificial intelligence and mobile robots: case studies of successful robot systems
Map learning and high-speed navigation in RHINO
Artificial intelligence and mobile robots
The interactive museum tour-guide robot
AAAI '98/IAAI '98 Proceedings of the fifteenth national/tenth conference on Artificial intelligence/Innovative applications of artificial intelligence
Neural Network Perception for Mobile Robot Guidance
Navigating Mobile Robots: Systems and Techniques
Active Face Tracking and Pose Estimation in an Interactive Room
CVPR '96 Proceedings of the 1996 Conference on Computer Vision and Pattern Recognition (CVPR '96)
Gesture recognition using the Perseus architecture
CVPR '96 Proceedings of the 1996 Conference on Computer Vision and Pattern Recognition (CVPR '96)
A Mobile Robot That Recognizes People
TAI '95 Proceedings of the Seventh International Conference on Tools with Artificial Intelligence
Socially embedded learning of the office-conversant mobile robot Jijo-2
IJCAI'97 Proceedings of the Fifteenth international joint conference on Artificial intelligence - Volume 2
An architecture for vision and action
IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence - Volume 1
Recognizing and interpreting gestures on a mobile robot
AAAI'96 Proceedings of the thirteenth national conference on Artificial intelligence - Volume 2
A Gesture Based Interface for Human-Robot Interaction
Autonomous Robots
Social Interaction of Humanoid Robot Based on Audio-Visual Tracking
IEA/AIE '02 Proceedings of the 15th international conference on Industrial and engineering applications of artificial intelligence and expert systems: developments in applied artificial intelligence
Realizing Audio-Visually Triggered ELIZA-Like Non-verbal Behaviors
PRICAI '02 Proceedings of the 7th Pacific Rim International Conference on Artificial Intelligence: Trends in Artificial Intelligence
Design and implementation of personality of humanoids in human humanoid non-verbal interaction
IEA/AIE'2003 Proceedings of the 16th international conference on Developments in applied artificial intelligence
Real-time auditory and visual multiple-object tracking for humanoids
IJCAI'01 Proceedings of the 17th international joint conference on Artificial intelligence - Volume 2
Decomposition of human motion into dynamics-based primitives with application to drawing tasks
Automatica (Journal of IFAC)
For mobile robots to assist people in everyday life, they must be easy to instruct. This paper describes a gesture-based interface for human-robot interaction that enables people to instruct robots through easy-to-perform arm gestures. Such gestures may be static pose gestures, which involve only a specific configuration of the person's arm, or dynamic motion gestures (such as waving). Gestures are recognized in real time, at approximately frame rate, using a hybrid approach that integrates neural networks and template matching. A fast, color-based tracking algorithm enables the robot to track and follow a person reliably through office environments with drastically changing lighting conditions. Results are reported in the context of an interactive clean-up task, in which a person guides the robot to specific locations that need to be cleaned, and the robot picks up trash and delivers it to the nearest trash bin.
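The abstract mentions a fast, color-based tracking algorithm that stays reliable under changing lighting. The sketch below is only an illustrative approximation, not the paper's actual tracker: it thresholds pixels against a running color model, takes the centroid of the matched region as the person's position, and slowly adapts the model toward the observed color (function names, the tolerance `tol`, and the adaptation rate `alpha` are all hypothetical).

```python
import numpy as np

def color_mask(frame, model, tol=30):
    """Boolean mask of pixels whose RGB values lie within `tol` of `model`."""
    diff = np.abs(frame.astype(int) - np.asarray(model, dtype=int))
    return np.all(diff <= tol, axis=-1)

def track(frame, model, tol=30, alpha=0.1):
    """Return (centroid, updated_model).

    `centroid` is the (row, col) mean position of matching pixels,
    or None if nothing matched. The color model drifts toward the
    mean color of the matched region -- a crude stand-in for the
    lighting adaptation the paper's tracker would need.
    """
    mask = color_mask(frame, model, tol)
    if not mask.any():
        return None, model
    rows, cols = np.nonzero(mask)
    centroid = (rows.mean(), cols.mean())
    matched_mean = frame[mask].mean(axis=0)
    new_model = (1 - alpha) * np.asarray(model, dtype=float) + alpha * matched_mean
    return centroid, new_model
```

In a real system the thresholding would typically run in a chromaticity or HSV space rather than raw RGB, so that brightness changes affect the match less than hue changes.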