In this paper, we present our current research on developing a model of robot behavior that creates feelings of "being together" through the robot's body position and orientation. Creating feelings of "being together" will be an essential skill for robots that live with humans and take part in daily human activities, such as walking together or establishing joint attention to information in the environment. We observe people's proxemic behavior in joint attention situations and develop a behavior model that lets a robot detect a partner's attention shift and appropriately adjust its body position and orientation to establish joint attention with the partner. An experimental evaluation demonstrates the model's effectiveness.
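To make the described behavior concrete, the following is a minimal sketch, not the paper's actual model: it assumes the robot detects an attention shift from a change in the partner's gaze direction, then moves beside the partner at a social standoff distance and turns toward the new target, an F-formation-style adjustment. All function names, thresholds, and the geometric rule are illustrative assumptions.

```python
import math

def attention_shifted(prev_gaze, curr_gaze, threshold_rad=0.5):
    """Hypothetical detector: flag a gaze-direction change larger
    than threshold_rad (gaze vectors assumed unit-length 2D)."""
    dot = prev_gaze[0] * curr_gaze[0] + prev_gaze[1] * curr_gaze[1]
    # Clamp for numerical safety before taking the angle.
    return math.acos(max(-1.0, min(1.0, dot))) > threshold_rad

def joint_attention_pose(partner, target, standoff=1.2):
    """Illustrative repositioning rule: stand beside the partner,
    offset perpendicular to the partner-target line, then face
    the target so both can attend to it together."""
    dx, dy = target[0] - partner[0], target[1] - partner[1]
    d = math.hypot(dx, dy) or 1.0
    ux, uy = dx / d, dy / d          # unit vector partner -> target
    px, py = -uy, ux                 # perpendicular (side-by-side) direction
    pos = (partner[0] + px * standoff, partner[1] + py * standoff)
    yaw = math.atan2(target[1] - pos[1], target[0] - pos[0])
    return pos, yaw
```

For example, with the partner at the origin attending to a target at (2, 0), the sketch places the robot 1.2 m to the partner's side and orients it toward the target; a real system would additionally smooth the motion and respect the partner's personal space.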