Autonomous Robots
Spartacus is our robot entry in the 2005 AAAI Mobile Robot Challenge, which requires a robot to attend the National Conference on Artificial Intelligence. Designing robots capable of interacting with humans in real-life settings can be considered the ultimate challenge for intelligent autonomous systems. One key issue is the integration of multiple modalities (e.g., mobility, physical structure, navigation, vision, audition, dialogue, reasoning). Such integration increases both the diversity and the complexity of the interactions the robot can generate. It also makes it difficult to monitor how these increased capabilities are used in unconstrained conditions, whether while the robot is in operation or afterwards. This paper reports solutions and findings from our hardware, software and decisional integration work on Spartacus, and outlines perspectives for evolving the intelligence and interaction capabilities of autonomous robots.